UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · Published: Jan 7, 2024 · License: MIT · Architecture: Transformer · Open Weights

UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3 is a 7-billion-parameter decoder-only language model from UCLA-AGI, produced by the third iteration of Self-Play Fine-Tuning (SPIN). It is fine-tuned from zephyr-7b-sft-full, which is in turn based on Mistral-7B-v0.1, and is trained on synthetic data generated from the HuggingFaceH4/ultrachat_200k dataset. The model is primarily English-language, targets general-purpose natural language understanding and generation, and reports an average score of 63.70 on the Open LLM Leaderboard.
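Since the weights are published on the Hugging Face Hub under the model ID above, the model can be loaded with the standard transformers API. The sketch below is illustrative only: the prompt, dtype, and generation settings are assumptions, not configurations published by the authors.

```python
# Minimal sketch: loading UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3 with the
# Hugging Face transformers library and generating a short completion.
# Assumes transformers, torch, and accelerate (for device_map) are installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # illustrative; choose a dtype your hardware supports
    device_map="auto",           # place weights automatically across available devices
)

# Hypothetical prompt for demonstration purposes.
prompt = "Explain self-play fine-tuning in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```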
