psychopenguin/indian_legal_Phi-3-mini-4k-instruct
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 4k · Published: Mar 23, 2026 · License: MIT · Architecture: Transformer · Open Weights · Cold

psychopenguin/indian_legal_Phi-3-mini-4k-instruct is a 3.8-billion-parameter, instruction-tuned, decoder-only Transformer based on Microsoft's Phi-3 family. It is optimized for strong reasoning, particularly in math and logic, and supports a 4096-token context window. The model targets general-purpose AI systems and applications that need efficient performance in memory- or compute-constrained and latency-bound environments.
