mistralai/Mistral-Large-Instruct-2411
Text Generation

- Concurrency Cost: 4
- Model Size: 123B
- Quant: FP8
- Ctx Length: 32k
- Published: Nov 14, 2024
- License: MRL
- Architecture: Transformer

Mistral-Large-Instruct-2411 is an advanced 123-billion-parameter dense large language model developed by Mistral AI, featuring a 128k-token native context window. It excels at reasoning, knowledge tasks, and coding, supporting dozens of natural languages as well as programming languages such as Python, Java, and C++. The model is particularly optimized for agentic use: it offers native function calling, robust context adherence for RAG, and improved system prompt handling, making it well suited to complex, multi-turn conversational AI and automated task execution.
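As a minimal sketch of the native function calling mentioned above, the request below builds an OpenAI-style chat-completions payload exposing one tool to the model. The endpoint URL and the `get_weather` tool are illustrative assumptions, not part of the model card:

```python
import json

# Assumed OpenAI-compatible endpoint; check the provider docs for the real URL.
API_URL = "https://api.featherless.ai/v1/chat/completions"

def build_tool_call_request(user_message: str) -> dict:
    """Build a chat-completions payload that offers one tool to the model."""
    return {
        "model": "mistralai/Mistral-Large-Instruct-2411",
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",  # illustrative tool, not a real API
                    "description": "Look up the current weather for a city.",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
    }

payload = build_tool_call_request("What's the weather in Paris?")
print(json.dumps(payload, indent=2))
```

When the model decides to call the tool, the response carries a `tool_calls` entry whose arguments you execute locally before sending the result back as a `tool`-role message.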


Popular Sampler Settings

The three most popular sampler configurations among Featherless users for this model tune the following parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
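A hedged sketch of how these parameters might travel in a request body: `temperature`, `top_p`, `frequency_penalty`, and `presence_penalty` are standard OpenAI-style fields, while `top_k`, `repetition_penalty`, and `min_p` are common extensions accepted by many OpenAI-compatible servers. The values below are placeholders, not the actual top configurations from this page:

```python
def build_sampled_request(prompt: str) -> dict:
    """Assemble a chat-completions payload carrying explicit sampler settings."""
    return {
        "model": "mistralai/Mistral-Large-Instruct-2411",
        "messages": [{"role": "user", "content": prompt}],
        # Standard OpenAI-style sampling fields:
        "temperature": 0.7,        # randomness of token selection
        "top_p": 0.9,              # nucleus sampling cutoff
        "frequency_penalty": 0.0,  # penalize frequent tokens
        "presence_penalty": 0.0,   # penalize already-seen tokens
        # Common extensions on OpenAI-compatible servers (placeholder values):
        "top_k": 40,               # keep only the k most likely tokens
        "repetition_penalty": 1.05,  # multiplicative repeat discouragement
        "min_p": 0.05,             # drop tokens below this relative probability
    }

request = build_sampled_request("Write a haiku about autumn.")
```

Omitting any of these fields leaves the server's defaults in effect, so it is usually enough to set only the samplers you actually want to change.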