wassemgtk/chuck-norris-llm
Text generation · Model size: 32B · Quantization: FP8 · Context length: 32k · Concurrency cost: 2 · Published: Mar 21, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights
wassemgtk/chuck-norris-llm is a 32-billion-parameter causal language model, fine-tuned from Qwen3 32B and specialized for reasoning, math, and code generation. It was trained with Supervised Fine-Tuning (SFT) focused on chain-of-thought reasoning, enabling it to "think before it speaks" and show its work. The model is designed for complex logical tasks, code debugging, and general-purpose chat with enhanced reasoning, and distinguishes itself through its distinctive personality and problem-solving style.
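Since the model is fine-tuned from Qwen3 32B, it presumably inherits the Qwen3 convention of emitting its chain of thought inside `<think>...</think>` tags before the visible reply. A minimal sketch, under that assumption, of separating the reasoning trace from the final answer in generated text (the helper name and example output are illustrative, not part of the model card):

```python
import re

def split_reasoning(text: str) -> tuple[str, str]:
    """Split a <think>...</think> reasoning trace from the final answer.

    Assumes the Qwen3-style convention of emitting chain-of-thought
    inside <think> tags before the user-visible reply. If no tags are
    present, the whole text is treated as the answer.
    """
    match = re.search(r"<think>(.*?)</think>", text, flags=re.DOTALL)
    if match is None:
        return "", text.strip()
    reasoning = match.group(1).strip()
    answer = text[match.end():].strip()
    return reasoning, answer

# Hypothetical model output, for illustration only:
reasoning, answer = split_reasoning(
    "<think>2 + 2 is 4.</think>The answer is 4."
)
# reasoning -> "2 + 2 is 4.", answer -> "The answer is 4."
```

This lets an application log or hide the reasoning trace while showing only the final answer to the user.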