Mabeck/Heidrun-Mistral-7B-chat
Text Generation · Open Weights
Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 13, 2024 · License: MIT · Architecture: Transformer
Heidrun-Mistral-7B-chat is a 7 billion parameter chat model developed by Mabeck, fine-tuned from Heidrun-Mistral-7B-base with a 4096-token context length. It is specifically optimized for Danish language tasks, demonstrating strong performance in logic and reasoning. This model ranks as a leading open-source Danish LLM on the ScandEval benchmark, making it ideal for applications requiring high-quality Danish language understanding and generation.
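As an illustration only, here is a minimal sketch of how a chat prompt might be assembled for the model. The ChatML-style template shown is an assumption, not confirmed by this card; the authoritative format is the `chat_template` in the model repository's tokenizer config, and in practice `tokenizer.apply_chat_template` from the `transformers` library should be preferred over hand-built strings.

```python
def build_chat_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt (assumed format; verify against
    the model's tokenizer chat_template before relying on it)."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Example with a Danish instruction, matching the model's target language:
prompt = build_chat_prompt(
    "Du er en hjælpsom assistent.",      # "You are a helpful assistant."
    "Hvad er hovedstaden i Danmark?",    # "What is the capital of Denmark?"
)
print(prompt)
```

The resulting string would be passed to the model's `generate` call, which then completes the open `assistant` turn.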