TorieRingo/torie-mistral-7b
Text generation
- Concurrency cost: 1
- Model size: 7B
- Quantization: FP8
- Context length: 4k
- Published: Mar 6, 2026
- License: apache-2.0
- Architecture: Transformer
- Open weights · Cold
TorieRingo/torie-mistral-7b is a 7-billion-parameter, Mistral-based causal language model developed by TorieRingo. It was fine-tuned using Unsloth and Hugging Face's TRL library, enabling 2x faster training. The model is designed for general language tasks, leveraging the Mistral architecture's efficiency and performance.
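A minimal usage sketch, assuming the checkpoint is hosted on the Hugging Face Hub under the repo id shown on this card and loads with the standard `transformers` auto classes; the prompt and generation settings are illustrative, not part of this card. Running it requires network access and enough memory for a 7B model.

```python
# Hedged sketch: load TorieRingo/torie-mistral-7b with Hugging Face transformers.
# The repo id comes from this card; everything else is a generic causal-LM pattern.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TorieRingo/torie-mistral-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the checkpoint's native dtype
    device_map="auto",    # place weights on GPU if one is available
)

prompt = "Summarize the benefits of the Mistral architecture in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Deterministic generation; the 4k context length from the card bounds
# prompt plus generated tokens.
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the card lists an FP8 quantization; depending on how the weights were exported, loading may additionally require a quantization-aware backend, which is not shown here.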