norallm/normistral-7b-warm-instruct
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Apr 5, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
norallm/normistral-7b-warm-instruct is a 7-billion-parameter instruction-tuned causal language model from norallm, based on the Mistral architecture. It was continually pretrained on 260 billion subword tokens of Norwegian text and then instruction-tuned on a filtered, augmented, and translated collection of open datasets covering both Norwegian Bokmål and Nynorsk. Its permissive Apache-2.0 license allows commercial use, and its 4096-token context window makes it well suited to Norwegian-language tasks and multi-turn conversations.