grimjim/mistralai-Mistral-Nemo-Base-2407
Text generation
Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Aug 3, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Mistral-Nemo-Base-2407 is a 12-billion-parameter generative text model developed jointly by Mistral AI and NVIDIA. It was trained with a 128k context window on data with a substantial proportion of multilingual and code content. The model is designed as a drop-in replacement for Mistral 7B, offering stronger performance across text generation tasks and excelling in particular at multilingual understanding and code-related applications.
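As a base (non-instruct) model, it can be used for plain text completion. A minimal sketch of loading it with the Hugging Face `transformers` library, assuming the repo id above hosts standard `transformers`-compatible weights and that `transformers` and `torch` are installed:

```python
MODEL_ID = "grimjim/mistralai-Mistral-Nemo-Base-2407"


def complete(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a text completion for `prompt` (illustrative sketch)."""
    # Imports kept local so the sketch reads without the packages installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # device_map="auto" places the 12B weights across available GPUs/CPU.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(complete("The quick brown fox"))
```

Because this is a base model, prompts should be written as text to be continued rather than as chat-style instructions.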
