norallm/normistral-11b-long
Text generation · Model size: 12B · Quantization: FP8 · Context length: 32k · Published: Dec 8, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

NorMistral-11b-long is an 11.4-billion-parameter causal language model developed by the Language Technology Group (LTG) at the University of Oslo as part of the NORA.LLM family. It is a length-extended version of NorMistral-11b-warm, with the context length increased to 32,768 tokens. The model was continually trained on 50 billion subword tokens spanning Scandinavian, Sámi, and English text as well as code, making it well suited to research on Norwegian and Sámi language technology.
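A minimal usage sketch, assuming the model is published on the Hugging Face Hub under the ID `norallm/normistral-11b-long` and loads through the standard `transformers` causal-LM API; the helper names below are illustrative, not part of the model's documentation:

```python
# Sketch of loading NorMistral-11b-long via Hugging Face transformers.
# Assumption: the checkpoint is hosted on the Hub under MODEL_ID and is
# compatible with AutoModelForCausalLM (standard for the NORA.LLM family).

MODEL_ID = "norallm/normistral-11b-long"
MAX_CONTEXT = 32_768  # extended context window in tokens


def load_model():
    """Load tokenizer and model. Imports are deferred so the
    multi-gigabyte download only happens when actually needed."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",  # honour the published weight dtype
        device_map="auto",   # place layers across available devices
    )
    return tokenizer, model


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a continuation, truncating the prompt to the 32k window."""
    tokenizer, model = load_model()
    inputs = tokenizer(
        prompt,
        return_tensors="pt",
        truncation=True,
        max_length=MAX_CONTEXT,
    ).to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

The lazy import keeps the module cheap to load; the heavy download occurs only on the first `generate` call.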
