flozi00/Mistral-7B-german-assistant-v3

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Architecture: Transformer

flozi00/Mistral-7B-german-assistant-v3 is a 7-billion-parameter model based on Mistral v0.1, fine-tuned for German instruction following and conversational tasks. It supports an 8k-token context length and was trained on a deduplicated, cleaned dataset that excludes code. The model excels at understanding and generating German responses in an assistant-like style.


flozi00/Mistral-7B-german-assistant-v3 Overview

This model is a specialized fine-tuned version of the Mistral v0.1 architecture, featuring 7 billion parameters. Developed by flozi00, its primary focus is on German language instruction following and conversational interactions, adopting an Alpaca-style format with "### Assistant:" and "### User:" prompts.
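As a minimal sketch of that prompt convention, the snippet below assembles a conversation into the Alpaca-style format with `### User:` and `### Assistant:` tags. The helper name and the exact turn separator are assumptions for illustration; consult the model card for the canonical template.

```python
def build_prompt(turns):
    """Assemble an Alpaca-style prompt from (role, text) turns.

    Roles are "user" or "assistant"; the prompt ends with an open
    "### Assistant:" tag so the model generates the next reply.
    """
    tag = {"user": "### User:", "assistant": "### Assistant:"}
    parts = [f"{tag[role]} {text}" for role, text in turns]
    parts.append("### Assistant:")  # cue the model to answer
    return "\n".join(parts)

prompt = build_prompt([("user", "Wie ist das Wetter heute in Berlin?")])
print(prompt)
```

The resulting string can then be passed to any standard causal-LM inference stack as the input text.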

Key Capabilities

  • German Instruction Following: Optimized to accurately understand and execute instructions provided in German.
  • Conversational AI: Designed for engaging in natural, assistant-like dialogues in German.
  • Extended Context: Supports an 8k token context length, allowing for more coherent and extended conversations.
  • Cleaned Training Data: Trained on a meticulously deduplicated and cleaned dataset, specifically excluding code, to enhance its linguistic capabilities.
  • Sustainable Training: Notably, the model was trained on hardware powered entirely by renewable energy.
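To make use of the 8k-token context in a long-running chat, older turns eventually have to be dropped so the history still fits the budget. The sketch below shows one simple strategy; the word-count token estimate is a stand-in assumption, and in practice you would count tokens with the model's tokenizer (e.g. `len(tokenizer.encode(text))`).

```python
def trim_history(turns, max_tokens=8192, count=lambda s: len(s.split())):
    """Drop the oldest turns until the history fits the context budget.

    `count` is a crude token estimator (whitespace word count); swap in
    the model tokenizer for accurate budgeting.
    """
    kept = list(turns)
    while kept and sum(count(text) for _, text in kept) > max_tokens:
        kept.pop(0)  # discard the oldest turn first
    return kept

history = [
    ("user", "a " * 5000),       # ~5000 "tokens"
    ("assistant", "b " * 3000),  # ~3000 "tokens"
    ("user", "c " * 100),        # ~100 "tokens"
]
trimmed = trim_history(history, max_tokens=4000)
```

Dropping whole turns from the front keeps the most recent exchange intact, which matters most for a conversational assistant.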

Good For

  • Applications requiring a robust German-speaking assistant.
  • Chatbots and conversational agents interacting with German users.
  • Tasks that benefit from strong instruction adherence in German.