Amu/spin-phi2
Task: Text generation · Concurrency cost: 1 · Model size: 3B · Quant: BF16 · Context length: 2k · Published: Feb 22, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Amu/spin-phi2 is a 3 billion parameter language model fine-tuned from Microsoft's Phi-2 using the Self-Play Fine-Tuning (SPIN) method. It was trained on the ultrachat_200k dataset to strengthen its conversational capabilities, and it outperforms the original pretrained Phi-2 on the Open LLM Leaderboard. The model targets general conversational and reasoning tasks, making it suitable for applications that need robust language understanding and generation.
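As a minimal usage sketch, assuming the weights are published on the Hugging Face Hub under the ID `Amu/spin-phi2` and that the model follows base Phi-2's "Instruct:/Output:" prompt convention (an assumption; check the model card for the exact template used during SPIN training):

```python
def build_prompt(turns):
    """Render (user, assistant) exchanges into the Instruct:/Output: prompt
    style used by base Phi-2. The final turn may have assistant=None to
    leave the prompt open for generation. This format is an assumption."""
    parts = []
    for user, assistant in turns:
        parts.append(f"Instruct: {user}\nOutput:")
        if assistant is not None:
            parts.append(f" {assistant}\n")
    return "".join(parts)


def load_model(model_id="Amu/spin-phi2"):
    """Load tokenizer and model via Hugging Face transformers.
    Requires `torch` and `transformers`; downloads weights on first call."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # matches the BF16 precision listed above
    )
    return tokenizer, model
```

With a loaded model, `tokenizer(build_prompt(...), return_tensors="pt")` followed by `model.generate(...)` would produce a completion; loading BF16 weights for a 3B model needs roughly 6 GB of memory.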
