choiqs/Qwen3-1.7B-tldr-bsz128-ts300-regular-qrm-skywork8b-seed42-lr1e-6-warmup10-checkpoint100
Text generation · Concurrency cost: 1 · Model size: 2B · Quant: BF16 · Ctx length: 32k · Published: Apr 9, 2026 · Architecture: Transformer
choiqs/Qwen3-1.7B-tldr-bsz128-ts300-regular-qrm-skywork8b-seed42-lr1e-6-warmup10-checkpoint100 is a 1.7-billion-parameter language model based on the Qwen3 architecture, fine-tuned with a context length of 32,768 tokens. The repository name suggests a TL;DR-summarization fine-tune and appears to encode training settings (e.g. batch size 128, learning rate 1e-6, seed 42, checkpoint 100), but the available model card does not document the model's primary differentiators, training data, or intended use cases.
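Since the model card gives no usage instructions, the following is a minimal sketch of how such a checkpoint would typically be loaded with the Hugging Face `transformers` library. It assumes the repository is published on the Hugging Face Hub under the id above and that the "tldr" in the name implies a summarization prompt ending in "TL;DR:"; both the availability and the prompt format are assumptions, not documented facts.

```python
# Hypothetical usage sketch for the checkpoint described above.
# Assumptions (not confirmed by the model card):
#   - the repo is available on the Hugging Face Hub under MODEL_ID
#   - a "TL;DR:" prompt suffix matches the fine-tuning format

MODEL_ID = (
    "choiqs/Qwen3-1.7B-tldr-bsz128-ts300-regular-qrm-skywork8b"
    "-seed42-lr1e-6-warmup10-checkpoint100"
)
MAX_CONTEXT = 32_768  # 32k context length, per the model card


def summarize(post: str, max_new_tokens: int = 64) -> str:
    """Generate a TL;DR-style summary of `post` (hypothetical prompt format)."""
    # Imports are local so the module can be inspected without
    # torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16  # BF16, per the model card
    )

    prompt = f"{post}\n\nTL;DR:"
    inputs = tokenizer(
        prompt, return_tensors="pt", truncation=True, max_length=MAX_CONTEXT
    )
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

A caller would simply invoke `summarize(long_post)`; note that downloading the ~1.7B-parameter BF16 weights requires several gigabytes of disk and memory.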