flozi00/Mistral-7B-german-assistant-v4
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Oct 26, 2023 · Architecture: Transformer

flozi00/Mistral-7B-german-assistant-v4 is a 7 billion parameter language model based on the Mistral v0.1 architecture, fine-tuned specifically for German instruction following and conversational tasks. It supports a context length of 8k tokens and was trained on a deduplicated, cleaned, and uncensored dataset. This model excels at generating German responses in an Alpaca-style format, making it suitable for German-language assistant applications.
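Since the card says the model expects Alpaca-style prompts, here is a minimal sketch of building such a prompt and passing it to the model via Hugging Face `transformers`. The exact field headers (`### Instruction:`, `### Input:`, `### Response:`) follow the common Alpaca convention and are an assumption; the model's actual template may differ.

```python
# Sketch of an Alpaca-style prompt for a German instruction.
# The field headers below are the conventional Alpaca delimiters,
# assumed here since the card only says "Alpaca-style format".

def build_alpaca_prompt(instruction: str, user_input: str = "") -> str:
    """Assemble an Alpaca-style prompt; the optional input block is
    included only when user_input is non-empty."""
    if user_input:
        return (
            "### Instruction:\n" + instruction + "\n\n"
            "### Input:\n" + user_input + "\n\n"
            "### Response:\n"
        )
    return "### Instruction:\n" + instruction + "\n\n### Response:\n"

prompt = build_alpaca_prompt("Erkläre kurz, was ein Sprachmodell ist.")
print(prompt)

# Generation sketch (requires the model download and suitable hardware):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# repo = "flozi00/Mistral-7B-german-assistant-v4"
# tok = AutoTokenizer.from_pretrained(repo)
# model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")
# out = model.generate(**tok(prompt, return_tensors="pt").to(model.device),
#                      max_new_tokens=200)
# print(tok.decode(out[0], skip_special_tokens=True))
```

The separation into a pure prompt-building function keeps the template testable without loading the 7B weights.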
