sihab/slm-1.0
Text Generation
Concurrency Cost: 1
Model Size: 1.5B
Quantization: BF16
Context Length: 32k
Published: Nov 10, 2025
License: apache-2.0
Architecture: Transformer

SLM 1.0 is a 1.5 billion parameter specialized causal language model developed by NeuroBrain, with a 32,768 token context length. It is optimized for structured output generation, strict JSON schema compliance, and tool calling, and is suited to applications that require reliably formatted structured data.
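As a minimal sketch of how a client might consume such structured output, the snippet below parses a model reply and checks it against a small expected schema. The reply string, the field names, and the `validate` helper are all hypothetical examples, not actual SLM 1.0 output or API:

```python
import json

# Hypothetical raw completion from a structured-output prompt --
# illustrative only, not actual model output.
raw_reply = '{"name": "example", "confidence": 0.92, "tags": ["demo"]}'

# Minimal schema check: required keys and their expected types.
REQUIRED = {"name": str, "confidence": float, "tags": list}

def validate(reply: str) -> dict:
    """Parse a model reply and verify it matches the expected schema."""
    data = json.loads(reply)  # raises json.JSONDecodeError on malformed JSON
    for key, typ in REQUIRED.items():
        if not isinstance(data.get(key), typ):
            raise ValueError(f"field {key!r} missing or not {typ.__name__}")
    return data

result = validate(raw_reply)
```

A schema check like this is the downstream counterpart of the model's JSON compliance: even a model tuned for strict schemas should have its output validated before use.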
