norallm/normistral-7b-warm
Task: text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Feb 4, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

NorMistral-7b-warm is a 7-billion-parameter Norwegian language model developed by the Language Technology Group at the University of Oslo. It was initialized from Mistral-7b-v0.1 and continually pretrained on 260 billion subword tokens of Norwegian text and code. The model performs well on Norwegian language understanding and generation tasks, including sentiment analysis, reading comprehension, grammatical error correction, and machine translation, outperforming several other models in its class on Norwegian-specific benchmarks. It is primarily intended for research purposes and is not instruction-tuned, so it should be prompted with plain-text continuations rather than chat-style instructions.
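Since the model is a base (non-instruction-tuned) checkpoint published with open weights, a typical way to try it is via the Hugging Face `transformers` library. The sketch below is a minimal, hedged example: the model ID `norallm/normistral-7b-warm` comes from this page, while the dtype choice, device handling, and the few-shot translation prompt are illustrative assumptions, not an official usage recipe.

```python
def load_normistral(device: str = "cuda"):
    """Load NorMistral-7b-warm and its tokenizer.

    Requires `transformers` and `torch`; downloads ~14 GB of weights
    on first call, so imports are kept local to this function.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "norallm/normistral-7b-warm"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # assumption: bf16 fits a 7B model on one GPU
    ).to(device).eval()
    return tokenizer, model


def complete(prompt: str, device: str = "cuda", max_new_tokens: int = 64) -> str:
    """Greedy plain-text continuation (no chat template: base model)."""
    import torch

    tokenizer, model = load_normistral(device)
    inputs = tokenizer(prompt, return_tensors="pt").to(device)
    with torch.no_grad():
        out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Hypothetical few-shot style prompt for English-to-Bokmål translation;
    # a base model continues text, so the task is framed as a completion.
    print(complete("Engelsk: Good morning!\nBokmål:"))
```

Because the checkpoint is not instruction-tuned, tasks like translation or sentiment analysis are usually framed as text completions (optionally with a few in-context examples), as in the prompt above.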
