itsmepv/model_sft_dare
TEXT GENERATION
Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Mar 31, 2026 · Architecture: Transformer
itsmepv/model_sft_dare is a 1.5-billion-parameter instruction-tuned language model published by itsmepv. It is a general-purpose model; the card does not describe specific differentiators or primary use cases. Its context length of 32,768 tokens allows it to process extensive inputs.
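As a rough sketch of what the listed size and quantization imply for hardware: BF16 stores each parameter in 2 bytes, so the raw weight footprint can be estimated directly from the parameter count. The figures below are assumptions derived only from the card's stated "1.5B" and "BF16" values; KV cache and activations add to the total at runtime.

```python
def bf16_weight_bytes(n_params: float) -> float:
    """Estimate raw weight memory for BF16 storage (2 bytes per parameter)."""
    BYTES_PER_BF16 = 2
    return n_params * BYTES_PER_BF16

# 1.5B parameters, as stated on the card
params = 1.5e9
gib = bf16_weight_bytes(params) / 2**30
print(f"~{gib:.1f} GiB of weights")  # weights only, before KV cache/activations
```

This back-of-the-envelope estimate (about 2.8 GiB) suggests the model fits comfortably on consumer GPUs, though serving at the full 32k context requires additional memory for the KV cache.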