mistralai/Mistral-Nemo-Base-2407
TEXT GENERATION · Open Weights · Warm
Concurrency Cost: 1
Model Size: 12B
Quant: FP8
Ctx Length: 32k
Published: Jul 18, 2024
License: apache-2.0
Architecture: Transformer
Mistral-Nemo-Base-2407 is a 12 billion parameter pretrained generative text model developed jointly by Mistral AI and NVIDIA. It features a 128k context window and is trained on a significant proportion of multilingual and code data. This model is designed as a drop-in replacement for Mistral 7B, offering enhanced performance for various natural language processing tasks.
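For reference, here is a minimal sketch of loading the base model with Hugging Face transformers. It is not part of the original card: the dtype, device placement, and sampling values are illustrative assumptions, and a recent transformers release with Mistral-Nemo support is assumed.

```python
# Minimal sketch: load Mistral-Nemo-Base-2407 with Hugging Face transformers.
# Assumes a transformers version with Mistral-Nemo support and enough GPU memory;
# dtype and device_map choices are illustrative, not prescribed by the card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-Nemo-Base-2407"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 to fit the 12B weights comfortably
    device_map="auto",
)

# Base (non-instruct) model: plain text continuation, no chat template.
inputs = tokenizer("The Mistral NeMo model was trained to", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.3)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```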
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model. The parameters covered by each config are listed below; a hedged example of passing such settings in an API request follows the list.
- temperature
- top_p
- top_k
- frequency_penalty: –
- presence_penalty: –
- repetition_penalty
- min_p: –
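To make the parameter names above concrete, the sketch below shows how such sampler settings could be sent in a text-completion request. The base URL, endpoint, environment variable, and all values are assumptions for illustration (the actual top-ranked Featherless configs did not render above); verify the API route against the Featherless documentation.

```python
# Hedged sketch: sending sampler settings with a text-completion request.
# The base URL and API key handling are placeholders for an OpenAI-compatible
# endpoint; the sampler values are examples, not the actual "top 3" configs.
import os
import requests

payload = {
    "model": "mistralai/Mistral-Nemo-Base-2407",
    "prompt": "Once upon a time",
    "max_tokens": 128,
    # Sampler parameters mirroring the fields listed above (example values):
    "temperature": 0.3,
    "top_p": 0.95,
    "top_k": 40,
    "repetition_penalty": 1.05,
    # frequency_penalty, presence_penalty, and min_p were shown unset ("–") above,
    # so they are omitted here.
}

resp = requests.post(
    "https://api.featherless.ai/v1/completions",  # assumption: OpenAI-compatible route
    headers={"Authorization": f"Bearer {os.environ['FEATHERLESS_API_KEY']}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```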