MaziyarPanahi/calme-2.3-phi3-4b
Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 4k · Published: May 10, 2024 · License: MIT · Architecture: Transformer · Open Weights
MaziyarPanahi/calme-2.3-phi3-4b is a 4-billion-parameter language model fine-tuned with DPO (Direct Preference Optimization) from Microsoft's Phi-3-mini-4k-instruct. Developed by MaziyarPanahi, it was the best-performing Phi-3-mini-4k derivative on the Open LLM Leaderboard at the time of its release (May 2024). It retains the base model's 4096-token context length and is tuned for general instruction-following tasks, with strong results across benchmarks including MMLU and HellaSwag.
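Since this model is fine-tuned from Phi-3-mini-4k-instruct, prompts presumably follow the base model's chat format (`<|user|>`, `<|assistant|>`, `<|end|>` markers). As a minimal sketch, assuming that format carries over to this fine-tune, a prompt can be assembled like this; the `build_phi3_prompt` helper is hypothetical, not part of the model's tooling:

```python
# Sketch of Phi-3-style chat prompt formatting. Assumption: the fine-tune
# keeps the base Phi-3-mini-4k-instruct template; verify against the
# tokenizer's own chat template before relying on this.
def build_phi3_prompt(messages):
    """Render a list of {"role", "content"} dicts into a Phi-3-style prompt."""
    parts = []
    for m in messages:
        # Each turn is wrapped as <|role|>\n...<|end|>\n
        parts.append(f"<|{m['role']}|>\n{m['content']}<|end|>\n")
    parts.append("<|assistant|>\n")  # cue the model to generate its reply
    return "".join(parts)

prompt = build_phi3_prompt([{"role": "user", "content": "What is DPO?"}])
print(prompt)
```

In practice one would instead load the model with the Hugging Face `transformers` library and call the tokenizer's `apply_chat_template`, which applies whatever template the model actually ships; the encoded prompt plus generated tokens should stay within the 4k context window.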