lars1234/Mistral-Small-24B-Instruct-2501-writer
Text generation · Concurrency cost: 2 · Model size: 24B · Quant: FP8 · Context length: 32k · Published: Mar 6, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

Mistral-Small-24B-Instruct-2501-writer is a 24-billion-parameter instruction-tuned causal language model, fine-tuned by lars1234 from mistralai/Mistral-Small-24B-Instruct-2501. It is optimized for creative writing and shows improved scores on story-writing metrics relative to its base model, making it well suited to applications that call for diverse, engaging narrative generation.
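As an instruction-tuned Mistral derivative, the model expects prompts wrapped in Mistral's instruction markers. A minimal sketch of building such a prompt by hand is below; the exact chat template is an assumption here (the base model family uses `[INST] ... [/INST]` delimiters), and in practice the tokenizer's `apply_chat_template` method should be treated as authoritative.

```python
# Hedged sketch: hand-building a Mistral-style instruction prompt.
# The [INST]/[/INST] format is assumed from the base model family;
# prefer tokenizer.apply_chat_template in real use.
def build_prompt(user_message: str, system_prompt: str = "") -> str:
    """Wrap a user message in Mistral-style instruction markers."""
    sys_part = f"{system_prompt}\n\n" if system_prompt else ""
    return f"<s>[INST] {sys_part}{user_message} [/INST]"

prompt = build_prompt(
    "Write a 100-word story about a lighthouse keeper.",
    system_prompt="You are a creative fiction writer.",
)
print(prompt)
```

The resulting string can then be passed to any completion endpoint serving this model; the system prompt slot is optional and omitted entirely when empty.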
