sohammandal01/model_sft_dare_0.1
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 5, 2026 · Architecture: Transformer

sohammandal01/model_sft_dare_0.1 is a 1.5-billion-parameter language model with a 32,768-token context length, served in BF16 precision. It is a fine-tuned variant (the name suggests supervised fine-tuning combined with DARE-style weight merging at density 0.1, though this is not confirmed), but its documentation does not state the base model, architectural details, or training data. No differentiators or intended use cases are listed, so it should be treated as a general-purpose text-generation model pending further documentation.
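The card's stated size and precision allow a quick back-of-the-envelope memory estimate for the weights alone. This sketch assumes exactly 1.5B parameters, all stored in BF16 (2 bytes each), and ignores KV cache, activations, and runtime overhead, which depend on undocumented architectural details:

```python
# Rough weight-memory estimate from the model card's figures.
# Assumptions: 1.5e9 parameters, every parameter stored in bfloat16.
NUM_PARAMS = 1.5e9
BYTES_PER_PARAM = 2  # bfloat16 is 2 bytes per value

weight_bytes = NUM_PARAMS * BYTES_PER_PARAM   # total bytes for weights
weight_gib = weight_bytes / 2**30             # convert to GiB

print(f"Weights only: {weight_gib:.2f} GiB")  # → roughly 2.79 GiB
```

Actual serving memory will be higher once the 32k-token KV cache and inference buffers are included; those cannot be estimated without the hidden size and layer count, which the card does not provide.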
