DrRiceIO7/Gemma3-4B-CoT
VISION · Concurrency Cost: 1 · Model Size: 4.3B · Quant: BF16 · Ctx Length: 32k · Published: Dec 28, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights
DrRiceIO7/Gemma3-4B-CoT is a 4.3-billion-parameter language model developed by DrRiceIO7, fine-tuned from unsloth/gemma-3-4b-pt-unsloth-bnb-4bit. It was trained with GRPO using Unsloth and Hugging Face's TRL library, which is reported to give 2x faster training. The model is licensed under Apache-2.0; beyond the GRPO fine-tuning, its specific primary use case or optimization is not detailed.