Abdullah-Taha/UTN-Qwen3-0.6B-LoRA-merged
Text generation · Concurrency cost: 1 · Model size: 0.8B · Quantization: BF16 · Context length: 32k · Published: Apr 7, 2026 · License: MIT · Architecture: Transformer · Open weights

The Abdullah-Taha/UTN-Qwen3-0.6B-LoRA-merged model is a 0.8 billion parameter language model based on the Qwen3 architecture, fine-tuned with LoRA on domain-specific data from the University of Technology Nuremberg (UTN). It is intended to answer questions about and provide assistance with UTN topics, making it suitable for specialized information-retrieval tasks. Because the LoRA adapter has been merged into the base weights, the model can be loaded for direct inference without PEFT libraries, simplifying deployment for UTN-specific applications.
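Since the adapter is already merged, the model loads like any standard Hugging Face checkpoint. A minimal inference sketch is below; the question text and generation parameters are illustrative assumptions, not documented defaults, and only the model id comes from this card.

```python
MODEL_ID = "Abdullah-Taha/UTN-Qwen3-0.6B-LoRA-merged"


def build_messages(question: str) -> list[dict]:
    """Wrap a user question in the chat-message format Qwen3 models expect."""
    return [{"role": "user", "content": question}]


def main() -> None:
    # Imported lazily so the helper above works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # No PeftModel wrapper needed: the LoRA weights are merged into the checkpoint.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    prompt = tokenizer.apply_chat_template(
        build_messages("What study programs does UTN offer?"),  # example question (assumption)
        tokenize=False,
        add_generation_prompt=True,
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

The same checkpoint also works with other OpenAI-compatible or local serving stacks that accept standard Hugging Face model ids, since no adapter loading step is required.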
