XY26/Llama-3.1-8B-DeFramed
Text generation · Open weights · Cold start
Model size: 8B
Quantization: FP8
Context length: 32k
Concurrency cost: 1
Architecture: Transformer
License: apache-2.0
Published: Feb 15, 2026
XY26/Llama-3.1-8B-DeFramed is an 8-billion-parameter model based on the Llama 3.1 architecture, developed by XY26. It was finetuned with Unsloth and Hugging Face's TRL library, enabling 2x faster training, and is intended for general language tasks.
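A minimal usage sketch with Hugging Face `transformers`, assuming the model is served as a standard text-generation checkpoint under the repository id shown on this card. The generation settings are illustrative, not prescribed by the model authors:

```python
# Sketch: querying XY26/Llama-3.1-8B-DeFramed via the transformers pipeline API.
# The repository id comes from this card; max_new_tokens is an arbitrary example value.
MODEL_ID = "XY26/Llama-3.1-8B-DeFramed"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Run a single text-generation call against the model (downloads weights on first use)."""
    from transformers import pipeline  # lazy import; requires `pip install transformers`
    pipe = pipeline("text-generation", model=MODEL_ID)
    outputs = pipe(prompt, max_new_tokens=max_new_tokens)
    return outputs[0]["generated_text"]
```

Because the checkpoint is FP8-quantized with a 32k context window, it can handle long prompts on a single modern GPU, but actual memory requirements depend on the serving stack.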