sohammandal01/model_sft_dare_0.7
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quantization: BF16 · Context length: 32k · Published: Apr 5, 2026 · Architecture: Transformer

sohammandal01/model_sft_dare_0.7 is a 1.5-billion-parameter language model with a 32,768-token (32k) context length. It is a fine-tuned transformer, but specific architectural details and its primary differentiators are not provided in the available documentation. Its main use case and particular strengths are likewise undefined: the model card reads "More Information Needed" across most technical and application-specific sections.
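As a rough sanity check on the listed size and quantization, the parameter count and BF16 precision imply an approximate weight-memory footprint. The sketch below assumes the listed 1.5B count is the total parameter count; activations, KV cache, and runtime overhead are not included.

```python
# Rough weight-memory estimate for a 1.5B-parameter model stored in BF16.
# bfloat16 is 16 bits, i.e. 2 bytes per parameter.
params = 1.5e9          # parameter count from the model listing (assumed total)
bytes_per_param = 2     # bfloat16 = 2 bytes

weight_bytes = params * bytes_per_param
weight_gib = weight_bytes / 2**30

print(f"~{weight_gib:.2f} GiB of weights")  # ≈ 2.79 GiB
```

So the weights alone need roughly 3 GB of memory; actual serving memory will be higher once the 32k-token KV cache is allocated.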
