Pickamon/CogniTune-Qwen2.5-3B
Text Generation · Concurrency Cost: 1 · Model Size: 3.1B · Quant: BF16 · Ctx Length: 32k · Published: Mar 26, 2026 · License: other · Architecture: Transformer

Pickamon/CogniTune-Qwen2.5-3B is a 3.1-billion-parameter, domain-specialized AI/ML tutor model fine-tuned from Qwen2.5-3B-Instruct. Developed by Irtiza Saleem, it is built to explain AI/ML concepts with analogies and to correct misconceptions, rather than to produce encyclopedic responses. The model is optimized for clear, tutor-like explanations in the AI/ML domain and supports a 32,768-token context window.
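A minimal sketch of querying the model as a tutor via the Hugging Face `transformers` library, assuming the checkpoint is published on the Hub under the repo id shown and follows the standard Qwen2.5-Instruct chat format; the system prompt and question are illustrative, not part of the model card:

```python
MODEL_ID = "Pickamon/CogniTune-Qwen2.5-3B"  # assumed Hub repo id


def build_messages(question: str) -> list[dict]:
    """Wrap a learner question in the chat-message format Qwen2.5-Instruct
    models expect (system + user turns)."""
    return [
        {
            "role": "system",
            "content": (
                "You are a patient AI/ML tutor. Explain concepts with "
                "analogies and correct misconceptions."
            ),
        },
        {"role": "user", "content": question},
    ]


def ask(question: str, max_new_tokens: int = 512) -> str:
    """Load the model in BF16 (matching the card's quant setting) and
    generate a tutor-style answer. Imports are deferred so the helper
    above works without `transformers` installed."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="bfloat16", device_map="auto"
    )
    # Render the chat turns into token ids with the generation prompt appended.
    inputs = tokenizer.apply_chat_template(
        build_messages(question), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(ask("Why does dropout help prevent overfitting?"))
```

Keeping the dtype at BF16 matches the published quantization, so no extra conversion step is needed at load time.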
