CompassioninMachineLearning/pretrainingBasellama3kv3
Text generation
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 32k
Published: Jan 19, 2026
License: apache-2.0
Architecture: Transformer
Open weights, cold
CompassioninMachineLearning/pretrainingBasellama3kv3 is an 8-billion-parameter Llama-based language model developed by CompassioninMachineLearning. The model was pre-trained and optimized for efficiency using Unsloth and Hugging Face's TRL library, which the authors report gave 2x faster training. It offers a 32,768-token context length, making it suitable for applications that process longer sequences.
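A minimal loading sketch using the standard Hugging Face Transformers API. Only the repository id and the 32k context length come from this card; the function name `load_model` and the `torch_dtype`/`device_map` settings are illustrative assumptions, not instructions from the model authors.

```python
MODEL_ID = "CompassioninMachineLearning/pretrainingBasellama3kv3"
MAX_CONTEXT = 32768  # 32k-token context length stated on this card


def load_model():
    """Download and instantiate the model.

    Requires the `transformers` package, network access, and enough
    memory for ~8B parameters; imports are deferred so that merely
    defining this function stays dependency-free.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",  # keep the published low-precision weights where possible
        device_map="auto",   # spread layers across available devices
    )
    return tokenizer, model
```

Calling `load_model()` will download the full weights, so run it only on a machine with sufficient disk, memory, and (ideally) GPU capacity.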