distil-labs/distil-email-classifier
Task: Text generation
Model size: 0.8B parameters
Quantization: BF16
Context length: 32k
Concurrency cost: 1
Published: Jan 6, 2026
License: apache-2.0
Architecture: Transformer
Open weights: yes

The distil-labs/distil-email-classifier is a 0.8 billion parameter Qwen3-based model developed by Distil Labs, fine-tuned for local email classification. Utilizing knowledge distillation and supervised fine-tuning, it achieves 93% accuracy on a 10-way email classification task. This model is specifically designed for integration with n8n to enable fully local, privacy-preserving email auto-labeling without sending content to cloud LLMs.
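As a sketch of how such a local classifier might be called (for example, from an n8n HTTP Request node), the snippet below sends a prompt to a locally served, OpenAI-compatible completions endpoint and maps the reply onto a label. The endpoint URL, the 10-way label set, and the prompt format are all assumptions for illustration; they are not specified in this model card.

```python
# Hypothetical usage sketch. The endpoint, label names, and prompt
# wording below are assumptions -- adapt them to your local server
# and the model's actual label set.
import json
import urllib.request

# Assumed 10-way label set (illustrative only).
LABELS = ["work", "personal", "finance", "shopping", "travel",
          "newsletters", "social", "support", "spam", "other"]

def build_prompt(email_text: str) -> str:
    """Wrap the raw email text in a classification instruction."""
    return (
        "Classify the following email into exactly one of these labels: "
        + ", ".join(LABELS) + ".\n\nEmail:\n" + email_text + "\n\nLabel:"
    )

def parse_label(completion: str) -> str:
    """Map the model's free-text completion onto the known label set."""
    text = completion.strip().lower()
    for label in LABELS:
        if label in text:
            return label
    return "other"  # fall back when no known label appears

def classify(email_text: str,
             endpoint: str = "http://localhost:8000/v1/completions") -> str:
    """Send the prompt to a locally served model (endpoint is assumed)."""
    payload = json.dumps({
        "model": "distil-labs/distil-email-classifier",
        "prompt": build_prompt(email_text),
        "max_tokens": 8,
        "temperature": 0,
    }).encode()
    req = urllib.request.Request(
        endpoint, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return parse_label(body["choices"][0]["text"])
```

Because everything stays on localhost, email content never leaves the machine, which is the privacy property the model is designed around.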
