abacaj/phi-2-super
Text generation · Concurrency cost: 1 · Model size: 3B · Quant: BF16 · Context length: 2k · Published: Mar 1, 2024 · License: MIT · Architecture: Transformer · Open weights

abacaj/phi-2-super is a 3 billion parameter instruction-tuned causal language model based on Microsoft's Phi-2 architecture, further fine-tuned with Supervised Fine-Tuning (SFT) and Conditional Direct Preference Optimization (cDPO). The model is designed for general-purpose conversational AI and shows improved performance over its base model on benchmarks such as MT-Bench and HumanEval. It suits applications that need a compact yet capable language model for chat and instruction-following tasks.
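As a chat-tuned model published on the Hugging Face Hub, it can be driven through the standard `transformers` chat workflow. The sketch below is illustrative, not an official quickstart: `build_messages` is a hypothetical helper for this example, and it assumes the repository's tokenizer ships a chat template usable via `tokenizer.apply_chat_template` (check the model card for the exact prompt format before relying on this).

```python
def build_messages(user_prompt: str) -> list[dict]:
    # Hypothetical helper: wrap a single user turn in the
    # chat-message structure that apply_chat_template expects.
    return [{"role": "user", "content": user_prompt}]


def generate_reply(user_prompt: str, max_new_tokens: int = 256) -> str:
    # Lazy import so the helper above stays usable without
    # the (heavy) transformers/torch dependencies installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "abacaj/phi-2-super"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # matches the published BF16 weights
        device_map="auto",
    )

    # Render the conversation with the tokenizer's own chat template
    # (assumed to be present in the repo's tokenizer config).
    input_ids = tokenizer.apply_chat_template(
        build_messages(user_prompt),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)

    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(output[0][input_ids.shape[-1]:],
                            skip_special_tokens=True)
```

With the 2k-token context window, long conversations need truncation or summarization on the caller's side before each `generate_reply` call.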
