darkc0de/Xortron24DPO
Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Apr 16, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights
darkc0de/Xortron24DPO is a 24-billion-parameter Mistral-based language model, fine-tuned from TroyDoesAI/BlackSheep-24B. Developed by darkc0de, it was trained with Unsloth and Hugging Face's TRL library, a combination Unsloth reports as roughly 2x faster to train. It targets general language tasks and inherits the efficiency characteristics of the Mistral architecture.
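As a rough illustration of what the listed 24B parameter count and FP8 quantization imply for deployment, here is a back-of-envelope sketch of the weight memory footprint (weights only; KV cache for the 32k context and activation memory come on top, and the exact figure depends on the serving stack):

```python
# Back-of-envelope weight-memory estimate for a 24B-parameter model at FP8.
# FP8 stores one byte per parameter, so the weights alone need about 24 GB.
PARAMS = 24_000_000_000  # 24B parameters, from the model card
BYTES_PER_PARAM = 1      # FP8 = 8 bits = 1 byte per weight

weight_bytes = PARAMS * BYTES_PER_PARAM
weight_gb = weight_bytes / 1e9
print(f"FP8 weight footprint: ~{weight_gb:.0f} GB")  # ~24 GB
```

For comparison, the same weights at FP16 would need twice that (~48 GB), which is the usual motivation for serving a 24B model in FP8.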