Digsm003/model_sft_dare
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: Mar 12, 2026 · Architecture: Transformer

Digsm003/model_sft_dare is a 1.5-billion-parameter language model with a 32,768-token context length. It is a fine-tuned transformer, but the available documentation does not state its specific architecture or primary developer. Its intended use cases and differentiators are likewise unspecified; the model card reads "More Information Needed" across most sections.
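The "sft" and "dare" suffixes in the repository name suggest the model was supervised fine-tuned and then merged with DARE (Drop And REscale), though the model card does not confirm this. Under that assumption, the core DARE idea can be sketched in plain Python: drop each entry of the fine-tuning delta (fine-tuned weights minus base weights) with probability `p`, rescale the survivors by `1/(1-p)` so the expected delta is unchanged, and add the result back to the base weights. All names and values below (`dare_merge`, the toy weight lists) are hypothetical illustrations, not the actual merge recipe used for this model.

```python
import random

def dare_merge(base, delta, drop_prob, seed=0):
    """Toy DARE (Drop And REscale) merge on flat weight lists.

    Each delta entry is dropped with probability drop_prob; surviving
    entries are rescaled by 1/(1 - drop_prob) so the delta's expected
    value is preserved, then added back onto the base weights.
    """
    rng = random.Random(seed)  # seeded for reproducibility in this sketch
    scale = 1.0 / (1.0 - drop_prob)
    merged = []
    for b, d in zip(base, delta):
        if rng.random() < drop_prob:
            merged.append(b)               # delta dropped: base weight kept as-is
        else:
            merged.append(b + d * scale)   # delta kept and rescaled
    return merged

# Hypothetical example: zero base weights and a small fine-tuning delta.
base = [0.0, 0.0, 0.0, 0.0]
delta = [0.4, -0.2, 0.1, 0.3]
merged = dare_merge(base, delta, drop_prob=0.5)
```

Surviving delta entries are doubled here (scale = 2.0 at 50% drop), which is what keeps the merge unbiased in expectation even though most individual delta entries are discarded.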
