ferrazzipietro/crfTask-unsup-Qwen3-1.7B-datav3-all-merged
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Feb 23, 2026 · Architecture: Transformer · Warm

The ferrazzipietro/crfTask-unsup-Qwen3-1.7B-datav3-all-merged model is a language model based on the Qwen3-1.7B architecture (roughly 1.7 billion parameters, listed here as 2B), with a context length of 32,768 tokens. The "merged" suffix indicates that weights from different training stages or datasets have been combined into a single checkpoint. The listing provides no further differentiators, but the architecture and parameter count suggest it is suited to general language understanding and generation tasks.
