DiscoResearch/DiscoLM_German_7b_v1
Task: Text Generation · Model Size: 7B · Architecture: Transformer (Mistral-based) · Quant: FP8 · Ctx Length: 4k · Concurrency Cost: 1 · Published: Jan 14, 2024 · License: apache-2.0 · Open Weights

DiscoLM German 7b v1 is a 7-billion-parameter, Mistral-based large language model developed by DiscoResearch and optimized for German-language applications. It was fine-tuned with supervised fine-tuning (SFT) and direct preference optimization (DPO) on a large dataset of German and English instructions, and it excels at German text understanding, generation, and translation. While it maintains English fluency, its primary strength is robust, reliable German-language output for everyday use cases.
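As a Mistral-based instruction-tuned model, DiscoLM German 7b v1 is typically prompted with a chat-style template. The sketch below builds a ChatML-style prompt by hand; the exact template, the `<|im_start|>`/`<|im_end|>` tokens, and the German example messages are assumptions for illustration (in practice you would rely on the tokenizer's own chat template, e.g. `tokenizer.apply_chat_template` in Hugging Face `transformers`).

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Build a ChatML-style prompt string.

    Assumption: the model follows the ChatML convention used by many
    Mistral fine-tunes; verify against the tokenizer's chat template.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"  # generation continues from here
    )

# Hypothetical German example messages
prompt = build_chatml_prompt(
    "Du bist ein hilfreicher Assistent.",      # "You are a helpful assistant."
    "Erkläre kurz, was ein Sprachmodell ist.",  # "Briefly explain what a language model is."
)
print(prompt)
```

The resulting string would be tokenized and passed to the model's `generate` call; stopping on the `<|im_end|>` token keeps the reply to a single assistant turn.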
