sohammandal01/model_sft_dare_resta_0.3
Text Generation
Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 5, 2026 · Architecture: Transformer
sohammandal01/model_sft_dare_resta_0.3 is a 1.5-billion-parameter language model with a 32,768-token (32k) context length, published in BF16. It is a fine-tuned transformer, though specific architectural details and its primary differentiators are not documented. The model is intended for general language-generation tasks; its particular strengths or optimizations are not detailed.
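As a minimal sketch of the intended general-generation use, the snippet below loads the model and generates a completion. It assumes the checkpoint is hosted on the Hugging Face Hub under this id and is compatible with the standard `transformers` causal-LM API; neither is confirmed by the card above.

```python
# Inference sketch for sohammandal01/model_sft_dare_resta_0.3.
# Assumption: the checkpoint lives on the Hugging Face Hub under this id
# and loads with the generic AutoModelForCausalLM interface.

MODEL_ID = "sohammandal01/model_sft_dare_resta_0.3"
MAX_CONTEXT = 32_768  # advertised 32k context length


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model in BF16 and return a completion for `prompt`.

    Note: calling this downloads roughly 3 GB of weights on first use.
    """
    # Imported lazily so the sketch can be read without transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the published BF16 precision
        device_map="auto",           # place layers on GPU if one is available
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Prompts longer than `MAX_CONTEXT` tokens would need truncation before being passed to `generate`.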