FinaPolat/RAGED_Mistral-Nemo
- Task: Text Generation
- Concurrency Cost: 1
- Model Size: 12B
- Quantization: FP8
- Context Length: 32k
- Published: Apr 15, 2026
- License: apache-2.0
- Architecture: Transformer
- Open Weights
FinaPolat/RAGED_Mistral-Nemo is a 12 billion parameter Mistral-based causal language model developed by FinaPolat. It was finetuned using Unsloth and Hugging Face's TRL library, enabling 2x faster training. The model is designed for general language tasks, building on its Mistral architecture and efficient finetuning process.
Overview
FinaPolat/RAGED_Mistral-Nemo is a 12 billion parameter language model developed by FinaPolat. It is based on the Mistral architecture and was finetuned from unsloth/mistral-nemo-instruct-2407-bnb-4bit.
Key Characteristics
- Efficient Finetuning: The model was finetuned with Unsloth and Hugging Face's TRL library, achieving 2x faster training than a standard setup (see the sketch after this list).
- Mistral Foundation: Built upon the Mistral architecture, it inherits the strong general language understanding and generation capabilities of its base model.
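Below is a minimal sketch of what an Unsloth + TRL finetuning setup of this kind typically looks like, starting from the 4-bit base checkpoint named on this card. The dataset, LoRA parameters, and training arguments are illustrative placeholders rather than the values FinaPolat used, and the API calls assume recent Unsloth and TRL versions.

```python
from unsloth import FastLanguageModel  # import unsloth first so its patches apply
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Load the 4-bit base checkpoint listed as the starting point for this model.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/mistral-nemo-instruct-2407-bnb-4bit",
    max_seq_length=4096,
    load_in_4bit=True,
)

# Attach LoRA adapters; Unsloth patches the model for faster training.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# Placeholder dataset with a "text" column; swap in the actual finetuning data.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    args=SFTConfig(
        dataset_text_field="text",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        max_steps=100,
        output_dir="outputs",
    ),
)
trainer.train()
```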
Use Cases
This model is suitable for a variety of general-purpose language tasks where a 12 billion parameter model is a good fit, and its Apache-2.0 license allows for broad usage.
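As a usage illustration, the model can be loaded like any other causal language model from the Hugging Face Hub. This is a minimal sketch: the prompt and generation settings are illustrative, and it assumes the repository ships the chat template inherited from the instruct base model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "FinaPolat/RAGED_Mistral-Nemo"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumes a GPU with bf16 support
    device_map="auto",
)

# Build a chat-formatted prompt and generate a completion.
messages = [{"role": "user", "content": "Summarize the Apache-2.0 license in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```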