alibayram/magibu-11b
Capabilities: Vision
Concurrency Cost: 1
Model Size: 12B
Quant: FP8
Ctx Length: 32k
Published: Feb 14, 2026
Architecture: Transformer
Status: Cold

alibayram/magibu-11b is a fine-tuned language model developed by alibayram, based on an unspecified 11 billion parameter architecture. The model was trained with the TRL framework using Supervised Fine-Tuning (SFT) for instruction following. It is intended for general text generation, particularly answering prompts and questions in a conversational style.
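
The snippet below is a minimal usage sketch, assuming the checkpoint is hosted on the Hugging Face Hub under the ID alibayram/magibu-11b and ships a standard chat template; the model ID, dtype, and generation settings shown here are illustrative assumptions, not details confirmed by this card.

```python
from transformers import pipeline

# Minimal chat-style generation sketch (assumes the Hub ID below resolves
# and the tokenizer provides a chat template).
generator = pipeline(
    "text-generation",
    model="alibayram/magibu-11b",
    torch_dtype="auto",   # pick an appropriate dtype for your hardware
    device_map="auto",    # requires `accelerate` for automatic device placement
)

messages = [
    {"role": "user", "content": "Summarize supervised fine-tuning in two sentences."},
]

result = generator(messages, max_new_tokens=256)
# The pipeline returns the full conversation; the last message is the model's reply.
print(result[0]["generated_text"][-1]["content"])
```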
