sohammandal01/model_sft_dare_resta_0.7
Task: Text generation
Concurrency cost: 1
Model size: 1.5B
Quantization: BF16
Context length: 32k
Published: Apr 5, 2026
Architecture: Transformer
Status: Cold

sohammandal01/model_sft_dare_resta_0.7 is a 1.5-billion-parameter language model with a 32,768-token (32k) context length, published by sohammandal01. It is a fine-tuned variant, but its documentation does not currently specify the base model, the architectural details, or its primary differentiators, so its intended applications and particular strengths cannot yet be identified; further information is needed for precise use-case selection.
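For readers who want to try the model, a minimal usage sketch follows. It assumes the model is hosted on the Hugging Face Hub under the listed identifier and is compatible with the standard `transformers` causal-LM interface; neither assumption is confirmed by the card above, and the `generate` helper shown here is hypothetical.

```python
# Hypothetical usage sketch -- assumes the model is on the Hugging Face Hub
# and loads via the standard transformers causal-LM classes (unconfirmed).
MODEL_ID = "sohammandal01/model_sft_dare_resta_0.7"
MAX_CONTEXT = 32_768  # 32k-token context length, per the model card


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the model in BF16 (matching the listed quantization) and
    generate a continuation of ``prompt``."""
    # Imports are local so the constants above can be used without
    # torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Write a short haiku about autumn."))
```

Because the card leaves the model's intended tasks unspecified, treat this only as a starting point; check the repository for a chat template or recommended prompting format before relying on raw-completion generation.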
