choiqs/Qwen3-1.7B-tldr-bsz128-ts300-regular-qrm-skywork8b-seed42-lr1e-6-warmup10-checkpoint125
Text generation · Concurrency cost: 1 · Model size: 2B · Quant: BF16 · Ctx length: 32k · Published: Apr 9, 2026 · Architecture: Transformer

choiqs/Qwen3-1.7B-tldr-bsz128-ts300-regular-qrm-skywork8b-seed42-lr1e-6-warmup10-checkpoint125 is a language model developed by choiqs, fine-tuned from the 1.7-billion-parameter Qwen3 architecture (listed here in the 2B size class). It supports a context length of 32,768 tokens, indicating a focus on processing extensive inputs. Its name suggests fine-tuning for TL;DR-style summarization with batch size 128, a learning rate of 1e-6, 10 warmup steps, seed 42, and what appears to be a Skywork-8B reward model, with this repository corresponding to checkpoint 125 of that run; such a configuration points toward summarization and other long-context text-generation tasks.
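A minimal sketch of loading this checkpoint with the Hugging Face transformers library, assuming the repository id above is hosted on the Hub. The `"\nTL;DR:"` prompt suffix is an assumption based on the TL;DR summarization convention suggested by the model name, not a documented prompt format; weights are only downloaded when `summarize` is actually called.

```python
MODEL_ID = (
    "choiqs/Qwen3-1.7B-tldr-bsz128-ts300-regular-qrm-skywork8b"
    "-seed42-lr1e-6-warmup10-checkpoint125"
)


def summarize(text: str, max_new_tokens: int = 128) -> str:
    """Generate a TL;DR-style summary (downloads weights on first call)."""
    # Imported lazily so the module can be inspected without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")

    # Assumed prompt format: append a "TL;DR:" cue after the source text.
    inputs = tokenizer(text + "\nTL;DR:", return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Strip the prompt tokens and decode only the generated continuation.
    prompt_len = inputs["input_ids"].shape[1]
    return tokenizer.decode(output[0][prompt_len:], skip_special_tokens=True)
```

Passing `torch_dtype="bfloat16"` matches the BF16 quantization listed in the metadata above; on CPU-only machines the default dtype may be preferable.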
