norallm/normistral-11b-warm
Text Generation · Concurrency Cost: 1 · Model Size: 12B · Quant: FP8 · Context Length: 32k · Published: Sep 26, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

NorMistral-11b-warm is a 12 billion parameter Norwegian language model developed by the Language Technology Group (LTG) at the University of Oslo as part of the NORA.LLM family. Initialized from Mistral-Nemo-Base-2407, it underwent continual pretraining on 250 billion subword tokens drawn from a mix of Scandinavian, Sámi, English, and code data. The model is specifically optimized for Norwegian and other Scandinavian languages. It features a new tokenizer for faster inference, and its hybrid masked-causal training makes it suitable both for causal generative tasks and for use as a bidirectional encoder.
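To make the "hybrid masked-causal" idea concrete, the sketch below (plain Python, not the LTG training code) contrasts the two attention patterns the model was trained with: a causal mask, where each token attends only to earlier positions, and a bidirectional mask, where every token attends to every position.

```python
def causal_mask(n: int) -> list[list[int]]:
    """Generative mode: token i may attend only to positions j <= i."""
    return [[1 if j <= i else 0 for j in range(n)] for i in range(n)]


def bidirectional_mask(n: int) -> list[list[int]]:
    """Encoder mode: every token may attend to every position."""
    return [[1] * n for _ in range(n)]


if __name__ == "__main__":
    # The lower-triangular pattern of the causal mask:
    for row in causal_mask(4):
        print(row)
```

Hybrid masked-causal training mixes both patterns during pretraining, which is why the same weights can serve generative and encoder-style tasks.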


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model cover the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
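As a hypothetical sketch of how these settings are typically used, the helper below packages the sampler parameters listed above into a request payload for an OpenAI-compatible completions endpoint. The function name and the specific values are placeholders for illustration, not the actual user configurations from this page.

```python
# Hypothetical helper: collect sampler settings into a completions payload.
ALLOWED_SAMPLER_KEYS = {
    "temperature", "top_p", "top_k", "frequency_penalty",
    "presence_penalty", "repetition_penalty", "min_p",
}


def build_sampling_payload(model: str, prompt: str, **sampler) -> dict:
    """Validate sampler keys and merge them into a request body."""
    unknown = set(sampler) - ALLOWED_SAMPLER_KEYS
    if unknown:
        raise ValueError(f"unsupported sampler keys: {sorted(unknown)}")
    return {"model": model, "prompt": prompt, **sampler}


payload = build_sampling_payload(
    "norallm/normistral-11b-warm",
    "Oslo er",
    temperature=0.7,          # placeholder values, not a recommended config
    top_p=0.9,
    repetition_penalty=1.1,
)
print(payload["model"])
```

The payload dict can then be sent as the JSON body of a completions request; unknown keys are rejected early rather than silently ignored by the server.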