BraahMohamed1/Qwen3-8B-MyLoRA
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Feb 7, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

Qwen3-8B-MyLoRA is an 8.2-billion-parameter causal language model in the Qwen3 series from the Qwen team. It can switch between a 'thinking mode' for complex reasoning tasks such as math and coding and a 'non-thinking mode' for efficient general-purpose dialogue. The model performs well at reasoning, instruction following, agent tasks, and multilingual use, making it suitable for applications that need adaptable conversational AI.
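As a rough illustration of how the thinking/non-thinking switch works in Qwen3-style models: the chat template can pre-fill an empty `<think>` block in the assistant turn, so the model skips its reasoning phase. The sketch below is an approximation of that mechanism, not the official template (the real one ships with the tokenizer); the function name and the exact template strings are illustrative assumptions.

```python
# Illustrative sketch (NOT the official Qwen3 chat template) of the
# thinking-mode "soft switch": when thinking is disabled, an empty
# <think>...</think> block is pre-filled into the assistant turn so the
# model proceeds directly to its final answer.

def format_prompt(messages, enable_thinking=True):
    """Build a ChatML-style prompt approximating Qwen3's template."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # Open the assistant turn that the model will complete.
    parts.append("<|im_start|>assistant\n")
    if not enable_thinking:
        # Pre-filled empty reasoning block suppresses thinking mode.
        parts.append("<think>\n\n</think>\n\n")
    return "".join(parts)

msgs = [{"role": "user", "content": "What is 2 + 2?"}]
print(format_prompt(msgs, enable_thinking=False))
```

In practice, with the Hugging Face `transformers` tokenizer for Qwen3 models, the same toggle is exposed as the `enable_thinking` argument to `tokenizer.apply_chat_template(...)`.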
