alibayram/magibu-26b-merged

  • Capabilities: Vision
  • Concurrency Cost: 2
  • Model Size: 27B
  • Quant: FP8
  • Ctx Length: 32k
  • Published: Feb 20, 2026
  • License: apache-2.0
  • Architecture: Transformer
  • Weights: Open

alibayram/magibu-26b-merged is a 27-billion-parameter language model developed by alibayram, fine-tuned from alibayram/gemma3-27b-multi-turn. It was trained with Unsloth and Hugging Face's TRL library, a combination the author reports made training roughly 2x faster. It is designed for general language generation tasks, leveraging its large parameter count and efficient training methodology.


Model Overview

alibayram/magibu-26b-merged is a 27-billion-parameter language model developed by alibayram. It is a fine-tuned version of the alibayram/gemma3-27b-multi-turn base model, and therefore builds on the Gemma 3 architecture.

Key Characteristics

  • Parameter Count: With 27 billion parameters, this model sits in the large-scale LLM category.
  • Training Efficiency: The model was trained using Unsloth together with Hugging Face's TRL library, which the author reports made training roughly 2x faster than a conventional setup (a minimal sketch of such a setup follows this list).
  • License: The model is released under the Apache-2.0 license, allowing for broad use and distribution.
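
The card states only that Unsloth and TRL were used, so the actual recipe is unknown. The sketch below shows what a typical Unsloth + TRL LoRA fine-tuning run over the base checkpoint might look like; the dataset name (`my_org/multi_turn_chats`), sequence length, and all hyperparameters are illustrative assumptions, not the author's settings.

```python
# Minimal sketch of an Unsloth + TRL SFT run. Assumptions: LoRA adapters,
# a hypothetical dataset, and illustrative hyperparameters -- this is not
# the author's actual recipe.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer
from unsloth import FastLanguageModel

# Load the base checkpoint through Unsloth's patched loader
# (4-bit here only to keep memory manageable on a single GPU).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="alibayram/gemma3-27b-multi-turn",
    max_seq_length=4096,  # assumption; the card lists a 32k context window
    load_in_4bit=True,
)

# Attach LoRA adapters; Unsloth's fused kernels are where the
# reported ~2x training speedup comes from.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

dataset = load_dataset("my_org/multi_turn_chats", split="train")  # hypothetical

trainer = SFTTrainer(
    model=model,
    processing_class=tokenizer,  # `tokenizer=` in older TRL releases
    train_dataset=dataset,
    args=SFTConfig(
        output_dir="outputs",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        learning_rate=2e-4,
        max_steps=60,  # illustrative; a real run would train far longer
    ),
)
trainer.train()
```

After training, merging the LoRA weights back into the base model would produce a standalone checkpoint like this one, which is presumably what the "-merged" suffix refers to.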

Potential Use Cases

Given its large parameter count and efficient training, alibayram/magibu-26b-merged is suited to a variety of natural language processing tasks, particularly those that benefit from a robust, well-trained base model. Its gemma3-27b-multi-turn origin suggests it may perform especially well in conversational or multi-turn interaction scenarios, as sketched below.
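
As a quick illustration of multi-turn use, the following sketch loads the merged checkpoint with Hugging Face transformers and runs a single chat turn. It assumes the checkpoint loads as a standard causal LM and ships a chat template; the prompt and generation settings are arbitrary.

```python
# Minimal inference sketch using Hugging Face transformers.
# Assumes the merged checkpoint loads as a plain causal LM with a chat
# template; the prompt and generation settings are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "alibayram/magibu-26b-merged"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # a 27B model needs ~54 GB in bf16; shard or quantize as needed
    device_map="auto",
)

# Multi-turn conversations just append alternating user/assistant messages.
messages = [
    {"role": "user", "content": "Summarize the Gemma 3 architecture in two sentences."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    output = model.generate(input_ids, max_new_tokens=256)

# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```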