UCSC-VLAA/STAR1-R1-Distill-1.5B
Text generation · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Published: Apr 3, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights
UCSC-VLAA/STAR1-R1-Distill-1.5B is a 1.5-billion-parameter language model from UCSC-VLAA, fine-tuned on the STAR-1 dataset to enhance safety alignment in large reasoning models. It is designed to improve safety behavior with minimal impact on reasoning capability, integrating and refining data from multiple sources to provide policy-grounded reasoning samples for safer AI applications.
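As a minimal usage sketch, the checkpoint can be loaded with the Hugging Face `transformers` library, assuming it ships as a standard causal-LM checkpoint with a chat template (the prompt and generation settings below are illustrative, not from the model card):

```python
MODEL_ID = "UCSC-VLAA/STAR1-R1-Distill-1.5B"

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a chat-style completion from the model (illustrative sketch)."""
    # Imports deferred so the module loads without the heavy dependencies.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the published BF16 weights
        device_map="auto",
    )
    # Apply the checkpoint's own chat template to a single-turn prompt.
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
    )

if __name__ == "__main__":
    print(generate("How do I safely dispose of old batteries?"))
```

The model's 32k context length and BF16 quantization come from the metadata above; sampling parameters and hardware placement are left to `transformers` defaults here.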