hkust-nlp/deita-7b-v1.0
Task: text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4K
Published: Dec 20, 2023
License: apache-2.0
Architecture: Transformer
Weights: open

Deita 7B V1.0 by hkust-nlp is a 7-billion-parameter language model fine-tuned from Mistral-7B-v0.1 and optimized for instruction following through automatic data selection. It is trained on 6K automatically selected SFT examples and 10K randomly sampled DPO pairs, and shows strong performance on benchmarks such as MT-Bench and AlpacaEval. The model is designed for efficient, high-quality alignment, making it suitable for applications that require robust instruction-tuned responses.
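The card does not include usage code; a minimal sketch of querying the checkpoint with the Hugging Face `transformers` library might look like the following. The Vicuna-style prompt template in `build_prompt` is an assumption (Deita models are commonly served with it), so verify against the repository's tokenizer/chat-template configuration before relying on it.

```python
# Sketch: load hkust-nlp/deita-7b-v1.0 and answer one instruction.
# Assumes the `transformers` library is installed; the prompt template
# below is an ASSUMPTION -- check the repo's chat template for the real one.

def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in a Vicuna-style template (assumed format)."""
    return (
        "A chat between a curious user and an artificial intelligence assistant. "
        "The assistant gives helpful, detailed, and polite answers to the "
        f"user's questions. USER: {instruction} ASSISTANT:"
    )

def generate(instruction: str, model_name: str = "hkust-nlp/deita-7b-v1.0") -> str:
    """Download the model and generate a completion (requires GPU/large RAM)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

if __name__ == "__main__":
    print(build_prompt("Summarize the Deita data-selection approach."))
```

Note that the 7B FP8 checkpoint still needs several gigabytes of memory; the prompt-building helper can be tested without downloading any weights.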
