wvnvwn/llama-2-13b-chat-hf-lr5e-5-safedelta-scale0.1
Text generation · Concurrency cost: 1 · Model size: 13B · Quantization: FP8 · Context length: 4K · Published: May 3, 2026 · Architecture: Transformer
wvnvwn/llama-2-13b-chat-hf-lr5e-5-safedelta-scale0.1 is a 13-billion-parameter language model based on the Llama 2 architecture and fine-tuned for chat. With a 4096-token context window, it targets conversational AI tasks and serves as a foundation for building interactive chat experiences.
Overview
This model is a 13-billion-parameter language model built on the Llama 2 architecture and fine-tuned specifically for chat-based applications, making it suitable for interactive conversational AI. Its 4096-token context window accommodates moderately long multi-turn conversations. The suffix in the model name suggests a fine-tuning learning rate of 5e-5 and a SafeDelta scale of 0.1, though the training recipe is not otherwise documented here.
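Since this is a Llama 2 chat fine-tune, prompts typically follow the standard Llama-2-chat template with `[INST]` and `<<SYS>>` markers. The sketch below builds such a prompt in plain Python; it assumes this fine-tune keeps the upstream template (verify against the model's tokenizer chat template before relying on it).

```python
def build_llama2_chat_prompt(system_prompt, turns):
    """Build a Llama-2-chat style prompt string.

    turns: list of (user_message, assistant_reply_or_None) pairs, oldest
    first; the final pair's reply is None when asking the model to
    generate the next answer.
    """
    prompt = ""
    for i, (user, assistant) in enumerate(turns):
        # The system prompt is folded into the first user turn per the
        # Llama 2 convention.
        if i == 0 and system_prompt:
            user = f"<<SYS>>\n{system_prompt}\n<</SYS>>\n\n{user}"
        prompt += f"<s>[INST] {user} [/INST]"
        if assistant is not None:
            prompt += f" {assistant} </s>"
    return prompt


print(build_llama2_chat_prompt("You are a helpful assistant.", [("Hello!", None)]))
```

The resulting string is what you would pass to a text-generation endpoint serving this model; completed assistant turns are closed with `</s>` so earlier exchanges stay correctly delimited.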
Key Capabilities
- Conversational AI: Optimized for generating human-like responses in chat scenarios.
- Llama 2 Foundation: Benefits from the robust and widely-used Llama 2 base architecture.
- 13 Billion Parameters: Offers a balance between performance and computational requirements for various applications.
Good for
- Developing chatbots and virtual assistants.
- Prototyping conversational interfaces.
- Applications requiring a general-purpose chat model with a reasonable context window.
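Because the context window is fixed at 4096 tokens, long-running conversations eventually need their oldest turns dropped. A minimal sketch of that bookkeeping is below; the characters-per-token ratio is a rough heuristic (an assumption, not a property of this model), and in practice you would count tokens with the model's own tokenizer.

```python
CONTEXT_TOKENS = 4096     # this model's context length
CHARS_PER_TOKEN = 4       # crude English-text estimate; use a real tokenizer in production


def estimate_tokens(text: str) -> int:
    """Very rough token count based on character length."""
    return max(1, len(text) // CHARS_PER_TOKEN)


def trim_history(turns, reserve_for_reply=512):
    """Drop the oldest messages until the rest fits the context budget.

    turns: list of message strings, oldest first. A slice of the budget is
    reserved so the model still has room to generate its reply.
    """
    budget = CONTEXT_TOKENS - reserve_for_reply
    kept, total = [], 0
    for msg in reversed(turns):  # walk newest-first so recent turns survive
        cost = estimate_tokens(msg)
        if total + cost > budget:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))  # restore oldest-first order
```

Trimming newest-first keeps the most recent exchanges intact, which is usually the right trade-off for chat; summarizing dropped turns instead of discarding them is a common refinement.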