Mistral-Nemo-Base-2407 is a 12-billion-parameter pretrained generative text model developed jointly by Mistral AI and NVIDIA. It features a 128k-token context window, and its training data includes a large proportion of multilingual and code content. The model is designed as a drop-in replacement for Mistral 7B, offering improved performance across a range of natural language processing tasks.