myaniu/Vicuna-7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · License: apache-2.0 · Architecture: Transformer · Open weights

The myaniu/Vicuna-7B model is a 7-billion-parameter language model built on the LLaMA architecture, produced by applying the Vicuna delta weights (vicuna-7b-delta-v1.1) to the LLaMA-7B base. It is fine-tuned for general-purpose conversational AI and offers a 4096-token context length, making it suitable for interactive chat applications.
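Because Vicuna is a chat fine-tune, inputs should follow its conversation template rather than be sent as raw text. Below is a minimal sketch of a single-turn prompt in the Vicuna v1.1 style; the system preamble and role tags are assumptions based on the upstream Vicuna v1.1 release, not on this model card.

```python
# Hedged sketch of the Vicuna v1.1 prompt convention: a system preamble
# followed by "USER: ... ASSISTANT:". The exact wording below is an
# assumption based on the upstream Vicuna v1.1 release.

SYSTEM = (
    "A chat between a curious user and an artificial intelligence "
    "assistant. The assistant gives helpful, detailed, and polite "
    "answers to the user's questions."
)

def build_prompt(user_message: str) -> str:
    """Assemble a single-turn prompt in the assumed Vicuna v1.1 style."""
    return f"{SYSTEM} USER: {user_message} ASSISTANT:"

prompt = build_prompt("What is the capital of France?")
print(prompt.endswith("ASSISTANT:"))  # True
```

The resulting string would then be passed to whatever inference endpoint or text-generation pipeline serves the model; generation stops naturally when the model emits its end-of-sequence token after the assistant's reply.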
