abhinavakarsh0033/model_sft_dare_resta
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Mar 27, 2026 · Architecture: Transformer · Warm
abhinavakarsh0033/model_sft_dare_resta is a 1.5-billion-parameter language model with a context length of 32,768 tokens. It is a general-purpose transformer-based model; specific architecture and training details are not provided, and its primary use case and differentiating features remain unspecified, as the model card marks most sections "More Information Needed".
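The one concrete figure the card does give is the 32,768-token context window. A minimal sketch of budgeting prompt tokens against that limit follows; the `MODEL_ID` string is copied from the card, while the helper name and the budgeting policy are illustrative assumptions, not part of the model card:

```python
# Budget generation headroom against the 32k context window stated on the card.
# The helper below is a hypothetical utility, not an API shipped with the model.

MODEL_ID = "abhinavakarsh0033/model_sft_dare_resta"
CTX_LEN = 32_768  # context length from the card (32k tokens)

def max_new_tokens(prompt_tokens: int, ctx_len: int = CTX_LEN) -> int:
    """Return how many tokens can still be generated after the prompt.

    A prompt at or beyond the context window leaves no room, so clamp at 0.
    """
    return max(0, ctx_len - prompt_tokens)

if __name__ == "__main__":
    print(max_new_tokens(30_000))  # 2768 tokens of headroom remain
    print(max_new_tokens(40_000))  # 0: the prompt alone exceeds the window
```

If the repository is hosted on the Hugging Face Hub under that id, loading it with the usual `transformers` auto classes would be the natural next step, but the card does not confirm where or how the weights are distributed.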