MaziyarPanahi/phi-2-logical-sft
Text generation · Concurrency cost: 1 · Model size: 3B · Quant: BF16 · Context length: 2k · Published: Feb 24, 2024 · License: MIT · Architecture: Transformer · Open weights

MaziyarPanahi/phi-2-logical-sft is a 3-billion-parameter causal language model, fine-tuned by MaziyarPanahi from Microsoft's phi-2 base model on the Open-Platypus dataset to strengthen its logical reasoning and instruction-following capabilities. It scores an average of 61.50 on the Open LLM Leaderboard, making it suitable for tasks that require structured responses and logical problem-solving within its 2048-token context window.
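A minimal sketch of running the model with the Hugging Face `transformers` library, assuming the standard `AutoModelForCausalLM`/`AutoTokenizer` API; the prompt text and sampling settings below are illustrative, not part of the model card:

```python
# Sketch: text generation with MaziyarPanahi/phi-2-logical-sft.
# The sampling parameters are illustrative defaults, not values from the model card.
MODEL_ID = "MaziyarPanahi/phi-2-logical-sft"

def generation_config(max_new_tokens: int = 256) -> dict:
    # Keep prompt length + max_new_tokens within the 2048-token context window.
    return {
        "max_new_tokens": max_new_tokens,
        "do_sample": True,
        "temperature": 0.7,
        "top_p": 0.9,
    }

def main() -> None:
    # Heavy dependencies are imported lazily so the sketch can be read
    # (and the helper above tested) without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the published BF16 weights
        device_map="auto",
    )
    prompt = "If all A are B and all B are C, what can we conclude about A and C?"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, **generation_config())
    print(tokenizer.decode(output[0], skip_special_tokens=True))

if __name__ == "__main__":
    main()
```

Because the base model is phi-2, generation should work with a plain instruction-style prompt; if the fine-tune expects a specific chat template, `tokenizer.apply_chat_template` can be used instead of the raw prompt.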
