FinaPolat/Mistral-Nemo-Instruct-2407_openED

Text generation · Model size: 12B · Quant: FP8 · Context length: 32k · Concurrency cost: 1 · Published: Apr 13, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

FinaPolat/Mistral-Nemo-Instruct-2407_openED is a 12 billion parameter instruction-tuned Mistral-based language model developed by FinaPolat. It was fine-tuned with Unsloth and Hugging Face's TRL library, which accelerated training. The model targets general instruction-following tasks, building on the Mistral architecture and an optimized training process.


Model Overview

FinaPolat/Mistral-Nemo-Instruct-2407_openED is a 12 billion parameter instruction-tuned model based on the Mistral architecture. Developed by FinaPolat, it was fine-tuned from the 4-bit checkpoint unsloth/mistral-nemo-instruct-2407-bnb-4bit.

Key Characteristics

  • Architecture: Mistral-based, designed for efficient instruction following.
  • Training Optimization: Fine-tuned using Unsloth and Hugging Face's TRL library, enabling roughly 2x faster training.
  • Parameter Count: 12 billion parameters, offering a balance between performance and computational requirements.
  • Context Length: Supports a context length of 32768 tokens, allowing for processing longer inputs and generating more extensive responses.
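Because the model is instruction-tuned, prompts generally need to follow the Mistral-family chat format. The sketch below shows the `[INST] … [/INST]` delimiters as a plain string helper; this assumes the finetune kept the base model's template, so verify against the tokenizer's actual chat template before relying on it.

```python
# Hedged sketch of the Mistral-family instruction format.
# Assumption: this finetune inherits the base model's [INST] ... [/INST]
# template; check tokenizer.apply_chat_template for the authoritative form.
from typing import Optional


def build_prompt(user_message: str, system: Optional[str] = None) -> str:
    """Wrap a single user turn in Mistral-style instruction delimiters.

    An optional system message is prepended to the user turn, since the
    Mistral format has no dedicated system role.
    """
    body = f"{system}\n\n{user_message}" if system else user_message
    return f"<s>[INST] {body} [/INST]"
```

In practice you would prefer the tokenizer's own `apply_chat_template`, which encodes the exact template shipped with the checkpoint; the helper above is only for illustrating the delimiter structure.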

Intended Use Cases

This model is suitable for general instruction-following applications: question answering, summarization, drafting, and similar assistant-style tasks. Its instruction tuning makes it adept at interpreting and executing user commands across a range of domains, while the 12B parameter count keeps hardware requirements moderate compared with larger models.
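A minimal way to try the model for such tasks is the standard Hugging Face transformers loading path. This is a sketch under assumptions not stated in the card: that `transformers` and `torch` are installed, that the checkpoint loads via `AutoModelForCausalLM`, and that the tokenizer ships a chat template; adjust dtype and device mapping to your hardware.

```python
# Hedged sketch: loading and querying the finetune with transformers.
# Assumptions (not confirmed by the model card): the repo id below resolves
# to a standard causal-LM checkpoint with a chat template.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "FinaPolat/Mistral-Nemo-Instruct-2407_openED"


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Run one instruction-following turn and return the decoded reply."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    # Let the tokenizer apply the checkpoint's own chat template.
    inputs = tokenizer.apply_chat_template(
        [{"role": "user", "content": prompt}],
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens, keeping only the newly generated reply.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Summarize the Mistral architecture in two sentences."))
```

Note that a 12B model in FP8 or 4-bit form still needs a sizeable GPU; for constrained setups, the bnb-4bit base checkpoint the card mentions is the lighter starting point.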