sohammandal01/model_sft_dare_0.3
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 5, 2026 · Architecture: Transformer

sohammandal01/model_sft_dare_0.3 is a 1.5-billion-parameter language model with a 32,768-token context length. Published by sohammandal01, it is a fine-tuned transformer intended for general language understanding and generation tasks. The combination of a modest parameter count and a long context window makes it a candidate for applications that need to process lengthy inputs efficiently.
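For rough capacity planning, the memory needed just to hold the weights follows directly from the card's stated parameter count and BF16 quantization (2 bytes per parameter). A minimal sketch of that arithmetic:

```python
# Approximate BF16 weight footprint from the figures on this model card.
PARAMS = 1.5e9        # 1.5 billion parameters
BYTES_PER_PARAM = 2   # BF16 is 16 bits = 2 bytes per parameter

weights_gib = PARAMS * BYTES_PER_PARAM / 1024**3
print(f"BF16 weights: ~{weights_gib:.1f} GiB")  # → BF16 weights: ~2.8 GiB
```

Note this covers the weights only; actual usage is higher once the KV cache for a 32k-token context and framework overhead are included, and those depend on layer and head dimensions not listed on this card.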
