reedmayhew/littlemonster-reasoning-v2-12B-QVO-HF
Capabilities: Vision · Concurrency Cost: 1 · Model Size: 12B · Quantization: FP8 · Context Length: 32k · Published: Mar 5, 2026 · License: apache-2.0 · Architecture: Transformer (open weights)
reedmayhew/littlemonster-reasoning-v2-12B-QVO-HF is a 12-billion-parameter Gemma 3 model developed by reedmayhew and fine-tuned for reasoning tasks. It was trained with Unsloth and Hugging Face's TRL library, which the author reports made training 2x faster. The model is intended for applications that require advanced reasoning within a 32,768-token (32k) context window.
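Since the weights are published on Hugging Face, the model can presumably be loaded with the standard `transformers` API. The sketch below is illustrative, not from the model card: the prompt, generation parameters, and the `build_chat` helper are assumptions, and hardware requirements (a 12B FP8 checkpoint needs a suitable GPU) are not addressed here.

```python
"""Hypothetical usage sketch for the littlemonster reasoning model via transformers."""

MODEL_ID = "reedmayhew/littlemonster-reasoning-v2-12B-QVO-HF"


def build_chat(question: str) -> list[dict]:
    """Wrap a user question in the chat-message format expected by apply_chat_template."""
    return [{"role": "user", "content": question}]


if __name__ == "__main__":
    # Heavy imports and the model download stay behind the main guard.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    messages = build_chat("A train leaves at 3 pm at 60 mph. How far has it gone by 5 pm?")
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    # Reasoning models tend to emit long chains of thought, so allow generous output.
    output = model.generate(inputs, max_new_tokens=1024)
    print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The chat-template path is a guess at the intended interface; check the repository's tokenizer config for the actual prompt format before relying on it.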