Digsm003/model_sft_dare_fv
Text generation · Concurrency cost: 1 · Model size: 1.5B · Quant: BF16 · Context length: 32k · Published: Mar 25, 2026 · Architecture: Transformer

Digsm003/model_sft_dare_fv is a 1.5 billion parameter language model developed by Digsm003, with a context length of 32768 tokens (32k). The model targets general language understanding and generation tasks, trading some raw capability for a compact parameter count. Its long context window makes it suitable for processing lengthy documents and maintaining coherence over extended conversations.
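Since the listed quantization is BF16, each parameter occupies 2 bytes, which allows a quick back-of-envelope estimate of the memory needed just to hold the weights. The sketch below assumes exactly 1.5e9 parameters and ignores activation memory, KV-cache for the 32k context, and framework overhead, all of which add to real usage:

```python
# Rough weight-memory estimate for a 1.5B-parameter model in BF16.
# Assumptions: exactly 1.5e9 parameters, 2 bytes per parameter (BF16);
# real memory use is higher due to activations, KV-cache, and overhead.
params = 1.5e9
bytes_per_param = 2  # BF16 = 16 bits

weight_bytes = params * bytes_per_param
weight_gib = weight_bytes / 2**30  # convert bytes to GiB

print(f"Weights alone: {weight_gib:.2f} GiB")  # ≈ 2.79 GiB
```

This suggests the model's weights fit comfortably on consumer GPUs or in system RAM, though serving the full 32k context will require additional memory for the KV-cache.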
