reaperdoesntknow/Disctil-Qwen3-1.7B
Text generation · Concurrency cost: 1 · Model size: 2B · Quant: BF16 · Context length: 32k · Published: Mar 28, 2026 · Architecture: Transformer · Warm

reaperdoesntknow/Disctil-Qwen3-1.7B is a 1.7-billion-parameter language model developed by Convergent Intelligence LLC: Research Division. It is a fine-tuned version of DiStil-Qwen3-1.7B-uncensored, part of the DistilQwen collection, which focuses on proof-weighted distillation from Qwen3-30B-A3B. The model is refined using Discrepancy Calculus (DISC), a measure-theoretic framework that quantifies the mismatch between integration and differentiation, with the aim of preserving structural boundaries in the model's weight space. It is intended for tasks where capturing nuanced structural information is critical, offering a distinct approach to model refinement.