Digsm003/model_sft_dare_resta
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: Mar 13, 2026 · Architecture: Transformer

Digsm003/model_sft_dare_resta is a 1.5-billion-parameter language model with a 32,768-token context window. Developed by Digsm003, it is a fine-tuned variant, though its current documentation does not specify the base model, architecture details, training procedure, or primary differentiators. Its intended use cases and unique strengths are likewise currently unspecified.
