choiqs/Qwen3-1.7B-tldr-bsz128-ts300-regular-qrm-skywork8b-seed42-lr1e-6-warmup10-checkpoint150
Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Apr 9, 2026 · Architecture: Transformer

choiqs/Qwen3-1.7B-tldr-bsz128-ts300-regular-qrm-skywork8b-seed42-lr1e-6-warmup10-checkpoint150 is a 1.7-billion-parameter language model (listed here under the 2B size class) based on the Qwen3 architecture, with a 32,768-token context length. It is a fine-tuned variant: the run name appears to encode the training recipe, suggesting fine-tuning on a TL;DR summarization task with a reward model, a batch size of 128, 300 training steps, a learning rate of 1e-6 with 10 warmup steps, seed 42, and a snapshot saved at checkpoint 150. The large context window makes it suitable for applications that require long-form textual understanding and generation.
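The hyperparameters above are read off the run name itself. A minimal sketch of that decoding, assuming the common `bsz`/`ts`/`lr`/`warmup`/`checkpoint` naming convention (the field meanings are an inference, not documented by the model's author):

```python
import re

MODEL_ID = ("choiqs/Qwen3-1.7B-tldr-bsz128-ts300-regular-qrm-skywork8b"
            "-seed42-lr1e-6-warmup10-checkpoint150")

def parse_run_name(model_id: str) -> dict:
    """Extract hyperparameters that appear to be encoded in the run name.

    Assumes bsz = batch size, ts = training steps, lr = learning rate,
    warmup = warmup steps, checkpoint = saved training step.
    """
    patterns = {
        "batch_size": r"bsz(\d+)",
        "train_steps": r"ts(\d+)",
        "seed": r"seed(\d+)",
        "learning_rate": r"lr(\d+e-?\d+)",
        "warmup_steps": r"warmup(\d+)",
        "checkpoint": r"checkpoint(\d+)",
    }
    found = {}
    for key, pat in patterns.items():
        m = re.search(pat, model_id)
        if m:
            found[key] = m.group(1)
    return found

print(parse_run_name(MODEL_ID))
# → {'batch_size': '128', 'train_steps': '300', 'seed': '42',
#    'learning_rate': '1e-6', 'warmup_steps': '10', 'checkpoint': '150'}
```

This kind of parser is handy when comparing many checkpoints from the same sweep, since the hyperparameters live only in the repository name.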
