krutrim-ai-labs/Krutrim-2-instruct
- Task: Text generation
- Concurrency cost: 1
- Model size: 12B
- Quantization: FP8
- Context length: 32K
- Published: Feb 2, 2025
- License: krutrim-community-license-agreement-version-1.0
- Architecture: Transformer
Krutrim-2-instruct is a 12 billion parameter instruction-tuned language model developed by the Ola Krutrim team, built on the Mistral-NeMo 12B architecture. The base model supports a 128K token context window (served here with a 32K context) and is trained across diverse domains including web data, code, math, and Indic languages. It delivers best-in-class performance on Indic tasks and competitive results on English benchmarks, with particular strength in multilingual generation and relevance to Indian cultural context.
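A minimal sketch of running the model locally, assuming the checkpoint exposes the standard Hugging Face transformers causal-LM interface with a chat template; the prompt, dtype choice, and generation settings are illustrative, not recommended defaults.

```python
# Sketch: querying Krutrim-2-instruct with Hugging Face transformers.
# Assumes the standard causal-LM + chat-template interface; settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "krutrim-ai-labs/Krutrim-2-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 weights fit on the target GPU
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Translate 'Good morning' into Hindi and Kannada."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```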
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.
Tracked sampler parameters: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p. A sketch showing how these parameters can be passed in a request follows below.
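A sketch of applying such a sampler configuration through an OpenAI-compatible chat-completions request. The base URL, the environment variable name, and which non-standard sampler fields the endpoint accepts are assumptions; the values shown are placeholders, not the actual configurations popular with Featherless users.

```python
# Sketch: passing sampler settings to an OpenAI-compatible endpoint.
# Base URL, env var name, and extra_body field support are assumptions;
# values are placeholders, not the tracked "popular" configurations.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",   # assumption: OpenAI-compatible endpoint
    api_key=os.environ["FEATHERLESS_API_KEY"],  # hypothetical environment variable
)

response = client.chat.completions.create(
    model="krutrim-ai-labs/Krutrim-2-instruct",
    messages=[
        {"role": "user", "content": "Summarise the plot of the Ramayana in two sentences."}
    ],
    temperature=0.7,           # placeholder values throughout
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    extra_body={               # non-standard sampler fields, if the server accepts them
        "top_k": 40,
        "min_p": 0.05,
        "repetition_penalty": 1.1,
    },
)
print(response.choices[0].message.content)
```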