malekgo/mistral-nemo-lp-ai
TEXT GENERATION
Concurrency Cost: 1 | Model Size: 12B | Quant: FP8 | Ctx Length: 32k | Published: Feb 21, 2026 | Architecture: Transformer | Status: Cold

The malekgo/mistral-nemo-lp-ai model is a 12-billion-parameter language model fine-tuned from unsloth/mistral-nemo-instruct-2407-bnb-4bit. It was trained with supervised fine-tuning (SFT) using the TRL framework and specializes in instruction-following tasks. With a 32,768-token context length, it is designed to generate coherent, contextually relevant text from user prompts, and the fine-tuning is intended to improve how effectively it handles a diverse range of instructions.
