bakulgrosirherbal/Qwen3-1.7B-Gemini-2.5-Flash-Lite-Preview-Distill
Task: Text generation
Concurrency cost: 1
Model size: 2B
Quantization: BF16
Context length: 32k
Published: Mar 16, 2026
License: apache-2.0
Architecture: Transformer
Open weights

bakulgrosirherbal/Qwen3-1.7B-Gemini-2.5-Flash-Lite-Preview-Distill is a 1.7-billion-parameter Qwen3-based language model developed by TeichAI. It was fine-tuned from unsloth/Qwen3-1.7B-unsloth-bnb-4bit on 1000 examples distilled from Gemini 2.5 Flash Lite Preview 09-2025. Training used Unsloth together with Hugging Face's TRL library for accelerated fine-tuning, demonstrating efficient distillation from a small, task-specific dataset.
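A minimal usage sketch with the Hugging Face `transformers` library, assuming the checkpoint is hosted under the repo id shown on this page; the generation settings below are illustrative defaults, not values from the model card:

```python
MODEL_ID = "bakulgrosirherbal/Qwen3-1.7B-Gemini-2.5-Flash-Lite-Preview-Distill"


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Run a single chat-style generation with the distilled model.

    Imports are deferred so the module can be inspected without
    transformers/torch installed; downloading the weights requires
    network access and several GB of disk.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    # Qwen3 tokenizers ship a chat template; apply it for instruction prompts.
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
    )


# Example (requires network and sufficient memory):
# print(generate("Explain model distillation in two sentences."))
```

The 32k context length and BF16 weights listed above mean the model fits comfortably on a single consumer GPU; `torch_dtype="auto"` lets `transformers` pick the checkpoint's native precision.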
