Brigham-Young-University/Qwen2.5-Coder-3B-Ilograph-Instruct
Task: Text Generation · Concurrency Cost: 1 · Model Size: 3.1B · Quant: BF16 · Ctx Length: 32k · Published: Mar 5, 2026 · License: MIT · Architecture: Transformer · Open Weights · Warm

Brigham-Young-University/Qwen2.5-Coder-3B-Ilograph-Instruct is a 3.1-billion-parameter causal language model fine-tuned from Qwen2.5-Coder-3B-Instruct. Developed by Chris Mijangos at BYU, it specializes in generating Ilograph Diagram Language (IDL) specifications from natural-language instructions. The model is optimized for producing diagrams centered on resources, relationships, and sequences, and supports a 32,768-token context length.
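A minimal sketch of prompting the model through the Hugging Face `transformers` library. The model ID comes from this card; the system prompt, example instruction, and generation settings are illustrative assumptions, not documented defaults.

```python
# Hedged sketch: prompting the model for IDL generation via transformers.
# The chat format follows the Qwen2.5 instruct family convention.

MODEL_ID = "Brigham-Young-University/Qwen2.5-Coder-3B-Ilograph-Instruct"

def build_messages(instruction: str) -> list[dict]:
    """Wrap a natural-language diagram request in chat-message form.
    The system prompt here is an assumption for illustration."""
    return [
        {"role": "system",
         "content": "You generate Ilograph Diagram Language (IDL) specifications."},
        {"role": "user", "content": instruction},
    ]

if __name__ == "__main__":
    # Requires `pip install transformers torch`; downloads the BF16 weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="bfloat16", device_map="auto"
    )

    messages = build_messages(
        "Diagram a web app with a load balancer, two app servers, "
        "and a Postgres database."
    )
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=512)
    # Decode only the newly generated tokens (the IDL specification).
    print(tokenizer.decode(
        out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    ))
```

The heavy model load is kept under the `__main__` guard so the prompt-building helper can be reused or tested without downloading weights.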
