danish-foundation-models/munin-7b-alpha
Munin 7B Alpha is a 7 billion parameter generative text model developed by the Danish Foundation Models Team. Based on Mistral-7B-v0.1, it has undergone continual pretraining on the Danish Gigaword dataset. This model is specifically designed to serve as a foundational large language model for Danish language applications. It is currently an alpha release intended for research and development purposes.
Munin 7B Alpha: A Danish Foundational LLM
Munin 7B Alpha is a 7 billion parameter large language model developed by the Danish Foundation Models Team. It is built upon the architecture of Mistral-7B-v0.1 and has been continually pretrained on the Danish Gigaword dataset, making it specialized for the Danish language.
Key Characteristics
- Architecture: Based on Mistral-7B-v0.1 (see the inspection sketch after this list).
- Parameters: 7 billion.
- Training: Continual pretraining on the Danish Gigaword dataset.
- Language Focus: Primarily developed for Danish language processing.
- Development Status: Currently an Alpha model, not recommended for production use.
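As a rough way to check these characteristics yourself, the sketch below loads the configuration and counts the parameters with Hugging Face transformers. It assumes the checkpoint is available on the Hub under the repo id at the top of this card and follows the standard Mistral-style configuration of its base model; it is an illustrative snippet, not an official part of the release.

```python
# Hypothetical sketch: inspect the model's configuration and parameter count.
from transformers import AutoConfig, AutoModelForCausalLM

model_id = "danish-foundation-models/munin-7b-alpha"

# The config alone is cheap to fetch and shows the Mistral-style hyperparameters.
config = AutoConfig.from_pretrained(model_id)
print(config.model_type, config.hidden_size, config.num_hidden_layers)

# Loading the full weights confirms the ~7B parameter count (needs enough RAM/VRAM).
model = AutoModelForCausalLM.from_pretrained(model_id)
print(f"{model.num_parameters() / 1e9:.1f}B parameters")
```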
Intended Use
This model is released as an alpha version for research and development within Danish language applications. Developers and researchers are encouraged to experiment with it and provide feedback. Like Mistral 7B, it is a pretrained base model without any built-in moderation mechanisms.
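As a starting point for experimentation, the sketch below shows one way to load the model and generate a short Danish continuation with Hugging Face transformers. It is a minimal example under the assumption that the checkpoint uses the standard causal language modeling interface of its Mistral-7B-v0.1 base; the prompt, precision, and sampling settings are illustrative only.

```python
# Minimal usage sketch, assuming the checkpoint works with the standard
# transformers causal-LM API (as its Mistral-7B-v0.1 base does).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "danish-foundation-models/munin-7b-alpha"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # half precision to fit on a single GPU
    device_map="auto",           # requires the `accelerate` package
)

# Illustrative Danish prompt: "Denmark is a country in Scandinavia, and its capital is"
prompt = "Danmark er et land i Skandinavien, og dets hovedstad er"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# This is a base model: expect free-form continuations, not instruction following.
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the model has not been instruction-tuned or moderated, outputs are plain continuations of the prompt and should be reviewed before any downstream use.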
For more details, refer to the release blog post and the codebase on GitHub.