distil-labs/distil-qwen3-4b-text2sql
Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Jan 11, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

Distil-Qwen3-4B-Text2SQL by Distil Labs is a 4-billion-parameter Qwen3-based model fine-tuned to convert natural-language questions into SQL queries. Trained via knowledge distillation from DeepSeek-V3, it achieves teacher-level accuracy on Text2SQL tasks while remaining compact enough for local deployment. With a context length of 262,144 tokens, the model excels at generating SQLite-compatible SQL queries from schemas with one or two tables, making it well suited to database chatbots and SQL assistants.


Popular Sampler Settings

The three parameter combinations most used by Featherless users for this model cover the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
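These parameters map directly onto the fields of an OpenAI-compatible chat-completion request. The sketch below shows one such payload; the numeric values are placeholders chosen for deterministic SQL generation, not the actual user configurations, and the endpoint conventions are assumed.

```python
# Illustrative request payload using the sampler parameters listed above.
# Values are placeholder assumptions (low temperature suits deterministic
# SQL output); they are NOT the real Featherless user configurations.

payload = {
    "model": "distil-labs/distil-qwen3-4b-text2sql",
    "messages": [
        {"role": "user", "content": "List all customers with orders over 100."}
    ],
    "temperature": 0.2,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
}
```

This dictionary would be serialized as the JSON body of a POST to the provider's chat-completions endpoint.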