Sandeep0079/model_sft_dare
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Apr 5, 2026 · Architecture: Transformer

Sandeep0079/model_sft_dare is a 1.5-billion-parameter language model with a 32,768-token context length. It is a fine-tuned transformer, though specific architectural details and training data are not provided in its current model card. Its primary characteristics and intended use cases are not explicitly described, suggesting it may be a base model or an early-stage fine-tune.
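
A minimal loading sketch is shown below, assuming the checkpoint is published on the Hugging Face Hub under this id and is compatible with the standard `AutoModelForCausalLM` interface; since the model card does not confirm the architecture, verify compatibility before relying on it. The BF16 dtype matches the quantization listed above.

```python
# Hedged sketch: load and sample from the model via Hugging Face Transformers.
# Assumes the repo id "Sandeep0079/model_sft_dare" resolves on the Hub and
# exposes a causal-LM head; this is not confirmed by the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Sandeep0079/model_sft_dare"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # BF16, per the listing above
    device_map="auto",           # requires the `accelerate` package
)

# Simple generation check; prompts may use up to the listed 32k-token context.
prompt = "Explain supervised fine-tuning in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```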
