sohammandal01/model_sft_dare_resta_0.1
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: Apr 5, 2026 · Architecture: Transformer

sohammandal01/model_sft_dare_resta_0.1 is a 1.5-billion-parameter language model with a 32,768-token (32k) context length. It is a Hugging Face Transformers model that was automatically pushed to the Hub. Because its model card provides little information, no specific differentiators or primary use cases beyond general language modeling can be identified.
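Since the model card itself gives no usage instructions, the following is a minimal, hedged sketch of how such a checkpoint is typically loaded with the Hugging Face `transformers` library. The repo id, BF16 dtype, and 32k context length come from the listing above; the `AutoModelForCausalLM` workflow is a generic assumption, not something the model card confirms.

```python
# Sketch: loading sohammandal01/model_sft_dare_resta_0.1 via transformers.
# Repo id, BF16, and the 32k context come from the listing; the rest is a
# standard causal-LM loading pattern and may not match the author's intent.

MODEL_ID = "sohammandal01/model_sft_dare_resta_0.1"
MAX_CONTEXT = 32_768  # 32k-token context length per the listing


def load(device: str = "cpu"):
    """Load tokenizer and model in BF16.

    Imports are deferred so this file can be read or tested without
    torch/transformers installed.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16
    ).to(device)
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load()
    inputs = tokenizer("Hello, world!", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Downloading the weights requires network access and roughly 3 GB of disk for a 1.5B model in BF16; quantized loading or `device_map="auto"` are common variations but are not documented for this particular checkpoint.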
