AgnivaSaha/model_sft_dare_resta
Text generation · Model size: 1.5B · Quant: BF16 · Context length: 32k · Concurrency cost: 1 · Architecture: Transformer · Published: Mar 22, 2026

AgnivaSaha/model_sft_dare_resta is a 1.5-billion-parameter language model with a 32,768-token context length. It is a Hugging Face Transformers model that was automatically pushed to the Hub. Further details about its architecture, training data, specific capabilities, and intended use cases are currently marked as "More Information Needed" in its model card.
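Since the card confirms only that this is a standard Transformers model on the Hub, the sketch below shows the generic loading and generation pattern one would typically try first. The causal-LM head, the BF16 dtype choice, and the `generate` parameters are assumptions, not details from the card; only the model ID and the 32k context length come from the page.

```python
MODEL_ID = "AgnivaSaha/model_sft_dare_resta"  # model ID from the card
MAX_CTX = 32768  # 32k context length stated on the card


def fits_context(prompt_tokens: int, max_new_tokens: int, max_ctx: int = MAX_CTX) -> bool:
    """Check that prompt plus requested completion stays within the context window."""
    return prompt_tokens + max_new_tokens <= max_ctx


def load_model():
    # Deferred import so the budget helper above works without transformers installed.
    # Standard Hub loading pattern; the causal-LM head is an assumption.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quant listed on the card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    return tokenizer, model


def generate(tokenizer, model, prompt: str, max_new_tokens: int = 128) -> str:
    inputs = tokenizer(prompt, return_tensors="pt")
    n_prompt = inputs["input_ids"].shape[1]
    if not fits_context(n_prompt, max_new_tokens):
        raise ValueError(f"{n_prompt} prompt tokens + {max_new_tokens} new exceeds {MAX_CTX}")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

Until the model card documents a chat template or intended use, treating it as a plain causal LM like this is a reasonable starting point.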
