sohammandal01/model_sft_dare_0.5
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: Apr 5, 2026 · Architecture: Transformer

sohammandal01/model_sft_dare_0.5 is a 1.5-billion-parameter language model developed by sohammandal01, with a 32,768-token context length and BF16 weights. It is a fine-tuned variant, but the available documentation does not describe its base architecture or training data. Its intended use cases are likewise not stated, so it is best treated as a general-purpose model that requires further evaluation before being applied to specific tasks.
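For reference, below is a minimal sketch of loading and prompting the model with Hugging Face transformers. It assumes the repository is available on the Hugging Face Hub under the same identifier and is compatible with the AutoModelForCausalLM interface; neither assumption is confirmed by the documentation above.

```python
# Minimal sketch: load the model and run a single generation.
# Assumes the repo id resolves on the Hugging Face Hub and the model
# is a standard causal LM checkpoint (not stated in the model card).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sohammandal01/model_sft_dare_0.5"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 precision listed above
    device_map="auto",
)

prompt = "Explain what a context window is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

At 1.5B parameters in BF16, the weights occupy roughly 3 GB, so the model should fit comfortably on a single consumer GPU or run on CPU for light workloads.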
