nicolomonti/qwen3-1.7b-1bit-align-ce-sft
Text generation · Concurrency cost: 1 · Model size: 2B · Quantization: BF16 · Context length: 32k · Published: Mar 27, 2026 · Architecture: Transformer

nicolomonti/qwen3-1.7b-1bit-align-ce-sft is a Qwen3-based language model with roughly 2 billion parameters, developed by nicolomonti. It was fine-tuned with supervised fine-tuning under a cross-entropy loss, using a merge-preserving 1-bit adapter. The 1-bit adapter weights keep the fine-tuning delta compact, targeting efficient deployment through a reduced memory footprint and faster inference.
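As a minimal usage sketch, assuming the repository is a standard Hugging Face `transformers` causal-LM checkpoint (the model card does not specify a loading recipe, so the API calls below are an assumption, not the author's documented method):

```python
# Repo id taken from the model card; everything else here is a hedged sketch.
MODEL_ID = "nicolomonti/qwen3-1.7b-1bit-align-ce-sft"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion with the model.

    Imports lazily so the heavy dependency and weight download only
    happen when the function is actually called. Assumes a standard
    `transformers` causal-LM layout for this repo.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # torch_dtype="auto" picks up the checkpoint's native BF16 weights.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Calling `generate("Explain 1-bit adapters in one sentence.")` would download the weights on first use; for production serving, a dedicated inference stack is the more common choice.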
