unsloth/Phi-3.5-mini-instruct
Text generation · Concurrency cost: 1 · Model size: 4B · Quant: BF16 · Ctx length: 4k · Published: Aug 20, 2024 · License: MIT · Architecture: Transformer · Open weights

unsloth/Phi-3.5-mini-instruct is a 3.8-billion-parameter, instruction-tuned, decoder-only Transformer model developed by Microsoft AI and the Phi team. It supports a 128K-token context length and is optimized for strong reasoning, particularly in code, math, and logic. The model performs well in memory- and compute-constrained environments and latency-bound scenarios, and offers competitive multilingual performance.
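As a sketch of how an instruction-tuned Phi-3.5 model might be prompted, the helper below renders chat messages into the `<|user|>` / `<|assistant|>` / `<|end|>` marker format documented for the Phi-3 family. This is an illustrative assumption about the template; in practice, prefer the tokenizer's own `apply_chat_template`, which reads the template shipped with the model weights.

```python
def build_phi3_prompt(messages):
    """Render chat messages into a Phi-3-style instruct prompt.

    Assumes the Phi-3 chat markers (<|system|>, <|user|>, <|assistant|>,
    <|end|>). Verify against the model's own chat template before use.
    """
    parts = []
    for msg in messages:
        # Each turn is tagged with its role and terminated by <|end|>.
        parts.append(f"<|{msg['role']}|>\n{msg['content']}<|end|>\n")
    # The model's reply is generated after the assistant marker.
    parts.append("<|assistant|>\n")
    return "".join(parts)

prompt = build_phi3_prompt([
    {"role": "user", "content": "Solve 12 * 13 step by step."},
])
print(prompt)
```

The resulting string can be passed to any text-generation backend serving the model; with the Hugging Face `transformers` library, `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` produces the equivalent prompt directly from the bundled template.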
