sreemanspl2/llama3-8b-acme-cpq-merged
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Dec 17, 2025 · Architecture: Transformer · Cold

The sreemanspl2/llama3-8b-acme-cpq-merged model is an 8-billion-parameter language model, likely based on the Llama 3 architecture, with a context length of 32,768 tokens. As a merged model, it combines weights or fine-tuning from multiple sources to broaden its capabilities. Its expected primary use is complex natural-language understanding and generation, where the large parameter count and extended context window support improved coherence and detail over long inputs.
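When working with a fixed 32,768-token context window, prompt length and requested generation length must be budgeted together. The sketch below illustrates that arithmetic with a rough characters-per-token heuristic; the helper name and the 4-chars-per-token estimate are illustrative assumptions, not the model's actual tokenizer.

```python
# Sketch: budgeting prompt vs. generation tokens for a 32k-context model.
# CTX_LENGTH matches the model card; the chars-per-token ratio is a rough
# heuristic for illustration, not the model's real tokenizer.
CTX_LENGTH = 32768

def fits_in_context(prompt: str, max_new_tokens: int, chars_per_token: int = 4) -> bool:
    """Roughly check that prompt tokens plus requested generation fit in the window."""
    est_prompt_tokens = len(prompt) // chars_per_token + 1
    return est_prompt_tokens + max_new_tokens <= CTX_LENGTH

print(fits_in_context("Summarize this document.", max_new_tokens=512))   # True
print(fits_in_context("x" * 200_000, max_new_tokens=1024))               # False: prompt alone exceeds 32k
```

In practice you would use the model's own tokenizer for an exact count; this heuristic only shows why requests near the context limit must trade prompt length against `max_new_tokens`.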
