bofenghuang/vigogne-2-13b-chat
Vigogne-2-13B-Chat is a 13 billion parameter French chat LLM, based on Meta's LLaMA-2-13B architecture, developed by bofenghuang. This model is specifically optimized to generate helpful and coherent responses in French conversations, making it ideal for French-language conversational AI applications. It leverages a context length of 4096 tokens and is fine-tuned for interactive chat scenarios.
Vigogne-2-13B-Chat: A French Conversational LLM
Vigogne-2-13B-Chat is a 13-billion-parameter large language model built on Meta's LLaMA-2-13B architecture and developed by bofenghuang. It focuses on generating helpful, coherent responses in French-language conversations. Designed for chat applications, the model uses a prompt template with <user>: and <assistant>: tokens to delimit turns, which can be applied with Hugging Face's apply_chat_template() method.
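The turn structure described above can be sketched as a minimal prompt builder. This is a hypothetical illustration that only mirrors the <user>:/<assistant>: markers mentioned in this card; in practice, the tokenizer's own apply_chat_template() should be preferred, since it encodes the exact template shipped with the model.

```python
# Hypothetical sketch of the chat turn format; the tokenizer's built-in
# chat template is authoritative, this is for illustration only.

def build_prompt(messages):
    """Concatenate chat turns using <user>:/<assistant>: markers."""
    parts = []
    for msg in messages:
        role = "user" if msg["role"] == "user" else "assistant"
        parts.append(f"<{role}>: {msg['content']}")
    # End with a bare assistant marker so the model writes the next turn.
    parts.append("<assistant>:")
    return "\n".join(parts)

conversation = [
    {"role": "user", "content": "Bonjour, peux-tu m'aider ?"},
]
print(build_prompt(conversation))
```

With Transformers, the equivalent step is `tokenizer.apply_chat_template(conversation, add_generation_prompt=True)`, which also handles any special tokens the template defines.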
Key Capabilities
- French Chat Optimization: Specifically fine-tuned for conversational interactions in French.
- Llama-2 Base: Benefits from the robust architecture of LLaMA-2-13B.
- Context Length: Supports a context window of 4096 tokens, suitable for multi-turn conversations.
- Ease of Use: Provides clear examples and a Google Colab notebook for inference with Hugging Face Transformers and vLLM.
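For the vLLM path mentioned above, one common setup is to serve the model behind vLLM's OpenAI-compatible API and send chat-completion requests to it. The payload below is a hedged sketch: the parameter values are illustrative, not recommendations, and the server setup itself is assumed rather than shown here.

```python
import json

# Hypothetical chat-completions request body for a vLLM OpenAI-compatible
# server hosting this model; values are illustrative placeholders.
payload = {
    "model": "bofenghuang/vigogne-2-13b-chat",
    "messages": [
        {"role": "user", "content": "Explique-moi la photosynthèse en deux phrases."},
    ],
    "max_tokens": 256,   # keep well inside the 4096-token context window
    "temperature": 0.7,
}
print(json.dumps(payload, ensure_ascii=False, indent=2))
```

The request/response shapes follow the OpenAI chat-completions convention that vLLM's server mimics, so standard OpenAI client libraries can be pointed at the server's base URL.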
Important Considerations
- License: Usage is governed by the Llama 2 license and its usage policy.
- Training Data: A significant portion of its training data is distilled from GPT-3.5-Turbo and GPT-4, so users should take care not to violate OpenAI's terms of use.
- Limitations: As the model is still under active development, it may occasionally generate harmful, biased, incorrect, or unhelpful content.