alibayram/magibu-26b-merged
Tags: Vision · Open Weights · Cold
Concurrency Cost: 2 · Model Size: 27B · Quant: FP8 · Ctx Length: 32k
Published: Feb 20, 2026 · License: apache-2.0 · Architecture: Transformer
alibayram/magibu-26b-merged is a 27-billion-parameter language model developed by alibayram, fine-tuned from alibayram/gemma3-27b-multi-turn. It was trained with Unsloth and Hugging Face's TRL library, achieving roughly 2x faster training. The model targets general language generation tasks, combining its large parameter count with this efficient fine-tuning setup.
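A minimal usage sketch, loading the merged checkpoint through the generic Transformers text-generation pipeline. The repository id comes from this card; the prompt is illustrative only, and since the base model is Gemma 3, loading may require a recent transformers release with Gemma 3 support.

```python
from transformers import pipeline

# Minimal sketch: load the merged checkpoint via the standard
# text-generation pipeline. Assumes the merged weights resolve
# through the generic auto classes.
generator = pipeline(
    "text-generation",
    model="alibayram/magibu-26b-merged",
    torch_dtype="auto",   # use the checkpoint's native dtype
    device_map="auto",    # place/shard across available devices
)

# Illustrative chat-style prompt; any user message works here.
messages = [{"role": "user", "content": "Explain what a merged fine-tuned checkpoint is."}]
output = generator(messages, max_new_tokens=128)
print(output[0]["generated_text"])
```

At FP8 quantization a 27B model still needs on the order of 30 GB of accelerator memory, so `device_map="auto"` is there to spread the weights across whatever GPUs are available rather than assuming a single device fits them.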