abacaj/phi-2-super
abacaj/phi-2-super is a 2.7 billion parameter instruction-tuned causal language model based on Microsoft's Phi-2 architecture, further fine-tuned using Supervised Fine-Tuning (SFT) and Conditional Direct Preference Optimization (cDPO). The model is designed for general-purpose conversational AI and shows improved scores on benchmarks such as MT-bench and HumanEval compared to the base Phi-2. It suits applications that need a compact yet capable model for chat and instruction-following tasks.
Overview
abacaj/phi-2-super is an instruction-tuned language model built on the Microsoft Phi-2 base model, with 2.7 billion parameters. It has undergone further training using Supervised Fine-Tuning (SFT) and Conditional Direct Preference Optimization (cDPO) to strengthen its conversational and instruction-following capabilities.
Key Capabilities
- Instruction Following: Optimized for understanding and responding to user instructions.
- Chat Template Adherence: Utilizes the Mistral instruct chat template, ensuring consistent and effective conversational interactions.
- Improved Performance: Demonstrates enhanced results on evaluation benchmarks such as MT-bench and HumanEval, indicating better reasoning and general language understanding than the original Phi-2.
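To illustrate the Mistral-instruct chat format the model follows, here is a minimal sketch of how a conversation is laid out as a prompt string. In practice you should call `tokenizer.apply_chat_template` from the `transformers` library, which handles special tokens (BOS/EOS) for you; the helper below is only an illustrative approximation of the `[INST] ... [/INST]` structure.

```python
def build_prompt(messages):
    """Format chat messages in the Mistral-instruct style that
    phi-2-super's chat template follows. Illustrative sketch only;
    prefer tokenizer.apply_chat_template in real code."""
    prompt = ""
    for msg in messages:
        if msg["role"] == "user":
            # User turns are wrapped in [INST] ... [/INST]
            prompt += f"[INST] {msg['content']} [/INST]"
        elif msg["role"] == "assistant":
            # Assistant turns end with an end-of-sequence marker
            prompt += f"{msg['content']}</s>"
    return prompt

print(build_prompt([{"role": "user", "content": "Hello"}]))
# → [INST] Hello [/INST]
```

With a multi-turn history, user and assistant turns simply alternate in the same pattern, which keeps the conversation state inside a single prompt string.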
Good For
- General Conversational AI: Ideal for chatbots and virtual assistants requiring a compact model.
- Instruction-Based Tasks: Suitable for applications where the model needs to follow specific commands or answer questions directly.
- Resource-Constrained Environments: Its 2.7B parameter size makes it a viable option for deployment in scenarios with limited computational resources, while still offering strong performance.
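As a rough sense of what "resource-constrained" means here, the weight memory of a model can be estimated as parameter count times bytes per parameter. The sketch below uses an assumed 2.7B parameter count and standard per-parameter sizes (2 bytes for fp16/bf16, 1 byte for int8); actual memory use is higher once activations and the KV cache are included.

```python
def weight_memory_gb(num_params, bytes_per_param=2):
    """Estimate weight memory in GB.

    bytes_per_param: 2 for fp16/bf16, 1 for int8 quantization.
    Ignores activations and KV cache, so this is a lower bound.
    """
    return num_params * bytes_per_param / 1e9

# ~2.7B parameters in half precision
print(round(weight_memory_gb(2.7e9), 1))     # → 5.4  (GB)
# Same model quantized to int8
print(round(weight_memory_gb(2.7e9, 1), 1))  # → 2.7  (GB)
```

This is why a ~2.7B model fits comfortably on a single consumer GPU in half precision, and on even smaller devices when quantized.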