distil-labs/distil-qwen3-0.6b-text2sql
Text generation · Model size: 0.8B · Quant: BF16 · Context length: 32k · Concurrency cost: 1 · Published: Jan 14, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

distil-labs/distil-qwen3-0.6b-text2sql is a 0.6-billion-parameter Qwen3-based model developed by Distil Labs and fine-tuned to convert natural-language questions into SQL queries. Distilled from DeepSeek-V3, this compact model achieves strong Text2SQL performance, reaching 74% LLM-as-a-Judge accuracy, a 2x improvement over its base model. With a 40,960-token context length, it is optimized for lightweight, fast local deployment on SQL generation tasks.
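A minimal sketch of calling the model locally with Hugging Face `transformers`. The prompt template (`build_prompt`) is an illustrative assumption, not the model's documented input format; check the model card for the exact template before relying on it.

```python
MODEL_ID = "distil-labs/distil-qwen3-0.6b-text2sql"


def build_prompt(schema: str, question: str) -> str:
    """Combine a table schema and a natural-language question into one prompt.

    NOTE: this template is a hypothetical example, not the model's
    documented prompt format.
    """
    return (
        "Given the following SQL schema, write a query answering the question.\n"
        f"Schema:\n{schema}\n"
        f"Question: {question}\n"
        "SQL:"
    )


def generate_sql(schema: str, question: str, max_new_tokens: int = 128) -> str:
    """Generate a SQL query for the question (downloads the model on first use)."""
    # Import lazily so the prompt helper above works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(build_prompt(schema, question), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True).strip()


if __name__ == "__main__":
    schema = "CREATE TABLE employees (id INT, name TEXT, salary INT);"
    print(generate_sql(schema, "Which employees earn more than 50000?"))
```

Given the model's small footprint, this runs comfortably on CPU, though BF16 weights benefit from a GPU or a recent CPU with native bfloat16 support.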
