simonlesaumon/Mistral-NeMo-12B-Unslopper-FR-v1
Text generation · Model size: 12B · Quant: FP8 · Context length: 32k · Architecture: Transformer · Concurrency cost: 1 · Published: Jan 31, 2026
simonlesaumon/Mistral-NeMo-12B-Unslopper-FR-v1 is a 12-billion-parameter instruction-tuned Mistral-NeMo model, fine-tuned and converted to GGUF format with Unsloth. It supports a 32,768-token context length and is optimized for deployment with llama.cpp and Ollama. The model targets general text-generation tasks, particularly in French, and leverages efficient training and conversion methods.
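Since the card names llama.cpp and Ollama as deployment targets, a minimal invocation sketch may help. The exact GGUF filename and quantization suffix below are assumptions (not listed on this card); Ollama's `hf.co/{user}/{repo}` pull syntax works only if the Hugging Face repository actually contains GGUF files.

```shell
# Option 1: llama.cpp — assumes the GGUF file has been downloaded locally;
# the filename and quant suffix are hypothetical.
./llama-cli -m Mistral-NeMo-12B-Unslopper-FR-v1.Q8_0.gguf \
    -c 32768 \
    -p "Résume ce texte en deux phrases :"

# Option 2: Ollama — pulls GGUF weights directly from the Hugging Face repo,
# assuming the repo hosts GGUF files.
ollama run hf.co/simonlesaumon/Mistral-NeMo-12B-Unslopper-FR-v1
```

Setting `-c 32768` uses the full advertised context window; a smaller value reduces memory use if the full window is not needed.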