Pam5/model_sft_dare
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Mar 25, 2026 · Architecture: Transformer

Pam5/model_sft_dare is a 1.5-billion-parameter language model with a 32,768-token (32k) context length. It is a general-purpose language model, but its current model card does not describe its architecture, training data, or primary differentiators; the name suggests supervised fine-tuning (SFT) followed by a DARE-style merge, though the card does not confirm this. Further information is needed to identify its specialized capabilities or optimal use cases.
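The `dare` suffix in the model name hints at a DARE-style merge (randomly Dropping fine-tuned delta parameters And REscaling the survivors), though the card does not confirm this. As a hedged illustration only, here is a minimal numpy sketch of that operation on a toy weight matrix; the function name and shapes are illustrative, not taken from this model's actual recipe:

```python
import numpy as np

def dare_delta(delta, drop_rate, rng):
    # DARE: zero out a random fraction (drop_rate) of the
    # fine-tuned delta, then rescale the survivors by
    # 1 / (1 - drop_rate) so the expected merged weights
    # match the original fine-tuned weights.
    mask = rng.random(delta.shape) >= drop_rate
    return np.where(mask, delta / (1.0 - drop_rate), 0.0)

rng = np.random.default_rng(0)
base = rng.normal(size=(4, 4))                       # base model weights
tuned = base + rng.normal(scale=0.01, size=(4, 4))   # SFT weights
delta = tuned - base
merged = base + dare_delta(delta, drop_rate=0.9, rng=rng)
```

Each entry of the sparsified delta is either zero (dropped) or the original delta scaled up, so on average the merge reproduces the fine-tuned model while keeping most delta entries sparse.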
