flammenai/Mahou-1.3-mistral-nemo-12B
Text Generation · Concurrency Cost: 1 · Model Size: 12B · Quant: FP8 · Context Length: 32K · License: apache-2.0 · Architecture: Transformer · Open Weights

flammenai/Mahou-1.3-mistral-nemo-12B is a 12 billion parameter language model developed by flammenai, built on the Mistral-Nemo architecture with a 32K context length. This model is specifically designed for conversational AI, excelling at generating short messages in casual conversation and character roleplay scenarios. It is fine-tuned using the ORPO method to enhance its interactive dialogue capabilities.
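Because the model's context window is 32K tokens, long roleplay conversations eventually need their oldest turns trimmed before each request. Below is a minimal sketch of such a helper; the function names are hypothetical, and token counts are approximated with a crude characters-per-token ratio rather than the model's real tokenizer:

```python
# Hypothetical history-trimming helper for a 32K-context model.
# Real deployments should count tokens with the model's own tokenizer;
# the chars-per-token ratio here is purely illustrative.

CONTEXT_LIMIT = 32_000   # Mahou-1.3-mistral-nemo-12B context length (tokens)
CHARS_PER_TOKEN = 4      # rough heuristic, NOT the real tokenizer

def estimate_tokens(text: str) -> int:
    """Crude token estimate based on character count."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def trim_history(messages: list[dict], reserve_for_reply: int = 512) -> list[dict]:
    """Drop the oldest non-system turns until the estimated prompt size
    fits the context window, always keeping the first (persona/system)
    message and the most recent turn."""
    budget = CONTEXT_LIMIT - reserve_for_reply
    kept = list(messages)
    while len(kept) > 2 and sum(estimate_tokens(m["content"]) for m in kept) > budget:
        kept.pop(1)  # remove the oldest non-system turn
    return kept
```

In practice the same idea applies regardless of how tokens are counted: reserve room for the reply, keep the persona message, and evict from the middle of the history first.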


Popular Sampler Settings

The three most popular parameter combinations among Featherless users for this model tune the following sampler settings (the specific values for each configuration are shown on the live model page):

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
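As a hedged illustration of how these fields fit together, the sketch below builds a completion-request payload carrying all seven sampler parameters. The values are common community defaults chosen for illustration only, not the actual top configurations measured by Featherless:

```python
import json

# Illustrative sampler values only -- NOT Featherless's measured top-3
# configurations, which are shown on the live model page.
def build_request(prompt: str) -> dict:
    """Assemble a completion-request payload with the sampler fields
    listed above. All numeric values are hypothetical defaults."""
    return {
        "model": "flammenai/Mahou-1.3-mistral-nemo-12B",
        "prompt": prompt,
        "max_tokens": 256,
        "temperature": 0.8,          # overall sampling randomness
        "top_p": 0.95,               # nucleus-sampling probability cutoff
        "top_k": 40,                 # restrict to the k most likely tokens
        "frequency_penalty": 0.0,    # penalize tokens by how often they appear
        "presence_penalty": 0.0,     # penalize tokens that have appeared at all
        "repetition_penalty": 1.05,  # discourage verbatim loops
        "min_p": 0.05,               # drop tokens below this relative probability
    }

payload = json.dumps(build_request("Hello!"))
```

Lower `temperature` with a modest `repetition_penalty` tends to suit short conversational replies, which is this model's stated focus; the right balance still has to be found empirically.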