grimjim/mistralai-Mistral-Nemo-Instruct-2407
TEXT GENERATION
Concurrency Cost: 1 | Model Size: 12B | Quant: FP8 | Ctx Length: 32k | Published: Jul 22, 2024 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold

Mistral-Nemo-Instruct-2407 is a 12-billion-parameter instruction-tuned large language model developed jointly by Mistral AI and NVIDIA. The base model supports a 128k context window (this deployment lists a 32k context length) and was trained on a large proportion of multilingual and code data, making it versatile across applications. It significantly outperforms existing models of similar or smaller size and is intended as a drop-in replacement for Mistral 7B. It is released under the Apache 2.0 license and excels at general language understanding and generation tasks.
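As with other Mistral instruct models, prompts are expected in the `[INST] ... [/INST]` instruct format. The sketch below hand-rolls that template for illustration; in real use, prefer the model tokenizer's own `apply_chat_template`, which is authoritative for this checkpoint. The function name and the optional system-prompt handling are assumptions for the example, not part of this listing.

```python
def build_mistral_prompt(user_message: str, system_prompt: str = "") -> str:
    """Wrap a user message in Mistral's [INST] instruct template.

    A minimal sketch; the model's own tokenizer chat template
    (tokenizer.apply_chat_template) is the canonical source for
    the exact formatting used by this checkpoint.
    """
    # Mistral instruct models fold any system prompt into the first
    # user turn rather than using a separate system role.
    content = f"{system_prompt}\n\n{user_message}" if system_prompt else user_message
    return f"<s>[INST] {content} [/INST]"


# Example: a single-turn instruction prompt.
prompt = build_mistral_prompt("Summarize the Apache 2.0 license in one sentence.")
```

The resulting string can be sent as the raw prompt to any completion endpoint serving this model.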
