myersjayt/TwinLlama-3.1-8B-DPO
Task: Text generation
Concurrency cost: 1
Model size: 8B
Quantization: FP8
Context length: 32k
Published: Jan 21, 2026
License: apache-2.0
Architecture: Transformer
Availability: Open weights (cold)
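
For reference, a minimal way to run the model for text generation with Hugging Face `transformers` is sketched below. It assumes the weights resolve on the Hugging Face Hub under the id above and that `transformers` and `accelerate` are installed; the prompt is illustrative.

```python
# Minimal inference sketch; assumes the repo id resolves on the Hugging Face Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "myersjayt/TwinLlama-3.1-8B-DPO"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # requires accelerate; spreads layers across available devices
    torch_dtype="auto",  # use the checkpoint's native precision
)

prompt = "Summarize what DPO fine-tuning changes about a base model."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```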

myersjayt/TwinLlama-3.1-8B-DPO is an 8-billion-parameter Llama-based language model from myersjayt, fine-tuned from myersjayt/TwinLlama-3.1-8B; the -DPO suffix indicates a Direct Preference Optimization stage. Training used Unsloth together with Hugging Face's TRL library, a combination Unsloth reports as roughly 2x faster than standard fine-tuning. The model targets general text-generation tasks.
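
The card names the training stack (Unsloth + TRL) but not the recipe. As a hedged sketch only, a DPO pass over the stated base model typically looks like the following; the preference-pair file, LoRA rank, and hyperparameters are illustrative assumptions, not the author's actual settings.

```python
# Illustrative DPO fine-tuning sketch with Unsloth + TRL; not the author's recipe.
from datasets import load_dataset
from trl import DPOConfig, DPOTrainer
from unsloth import FastLanguageModel

# Load the stated base model with Unsloth's optimized loader.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="myersjayt/TwinLlama-3.1-8B",
    max_seq_length=2048,
    load_in_4bit=True,  # memory-saving assumption for a single-GPU setup
)
model = FastLanguageModel.get_peft_model(model, r=16, lora_alpha=16)

# DPO expects preference pairs: "prompt", "chosen", "rejected" columns.
# The file name is a placeholder; the actual dataset is not stated on the card.
dataset = load_dataset("json", data_files="preference_pairs.json", split="train")

trainer = DPOTrainer(
    model=model,
    ref_model=None,  # with a PEFT adapter, TRL reuses the frozen base as reference
    args=DPOConfig(
        output_dir="twinllama-dpo",
        beta=0.1,  # strength of the preference margin penalty
        per_device_train_batch_size=2,
        num_train_epochs=1,
    ),
    train_dataset=dataset,
    tokenizer=tokenizer,
)
trainer.train()
```

Setting `ref_model=None` is the usual choice with LoRA adapters: TRL scores the reference policy by disabling the adapter rather than keeping a second full copy of the model in memory.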
