RaihanGG2026/gemma2-9b-easyBEN-merged
Text generation · Model size: 9B · Quantization: FP8 · Context length: 16k · Concurrency cost: 1 · Published: Apr 8, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights
RaihanGG2026/gemma2-9b-easyBEN-merged is a 9-billion-parameter Gemma 2 model published by RaihanGG2026. It was fine-tuned from unsloth/gemma-2-9b-it-bnb-4bit using Unsloth together with Hugging Face's TRL library, a workflow reported to give roughly 2x faster training. The model supports a 16,384-token context length and is distributed as merged weights intended for efficient deployment.
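To put the listed model size and FP8 quantization in perspective, a back-of-envelope sketch of the weight memory footprint (weights only; real deployments also need KV-cache and activation memory, and the true parameter count is only approximately 9e9):

```python
# Rough weight-storage estimate for a 9B-parameter model at different precisions.
# These are approximations; actual serving memory is higher.

def weight_memory_gib(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight storage in GiB."""
    return n_params * bytes_per_param / 2**30

N_PARAMS = 9e9  # nominal 9 billion parameters

fp8 = weight_memory_gib(N_PARAMS, 1)   # FP8: 1 byte per parameter
fp16 = weight_memory_gib(N_PARAMS, 2)  # FP16/BF16: 2 bytes per parameter
print(f"FP8:  {fp8:.1f} GiB")   # ~8.4 GiB
print(f"FP16: {fp16:.1f} GiB")  # ~16.8 GiB
```

This is why the FP8 quantization listed above roughly halves the weight footprint relative to a BF16 checkpoint of the same model.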