ParetoQaft/8B-base
Text generation
Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Jan 10, 2026 · License: llama3.1 · Architecture: Transformer

ParetoQaft/8B-base is an 8-billion-parameter base model from Meta's Llama 3.1 collection. This auto-regressive language model uses an optimized transformer architecture and supports a 128k-token context length. Trained on over 15 trillion tokens of publicly available online data with a December 2023 knowledge cutoff, it is intended for commercial and research use in multiple languages, excelling at natural language generation and supporting multilingual text and code output.
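A minimal usage sketch, assuming the checkpoint is hosted under the `ParetoQaft/8B-base` repo id shown above and that the `transformers` and `torch` packages are installed. Since this is a base (non-instruct) model, it performs plain text completion rather than chat-style instruction following:

```python
def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a plain-text continuation of `prompt` with the base model."""
    # Imports are deferred so this sketch can be read (and the function
    # defined) without the heavyweight dependencies installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "ParetoQaft/8B-base"  # assumed repo id, taken from the page title
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",  # use the dtype stored in the checkpoint config
        device_map="auto",   # place layers on available GPU(s), else CPU
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

For example, `generate("The capital of France is")` would return the prompt followed by the model's completion. Because this is a base model, prompts should be phrased as text to be continued, not as instructions.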
