MCult01/glm-muse-v1
Task: Text generation
Model size: 9B
Quantization: FP8
Context length: 32k
Published: Apr 9, 2026
License: apache-2.0
Architecture: Transformer (open weights)

MCult01/glm-muse-v1 is a 9-billion-parameter language model developed by MCult01, fine-tuned from THUDM/GLM-4-9B-0414. It was trained with Unsloth and Hugging Face's TRL library, achieving a 2x speedup during fine-tuning. With a 32,768-token context length, it is optimized for efficient processing and generation tasks.
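A quick back-of-envelope estimate of the memory footprint implied by the card's figures (9B parameters at FP8, i.e. one byte per parameter, versus two bytes for FP16) can be sketched as follows; the parameter count is taken as exactly 9 billion, which is an approximation of the card's "9B":

```python
# Rough VRAM needed for the model's weights alone, ignoring
# activations, KV cache, and framework overhead.
PARAMS = 9_000_000_000  # "9B" from the model card (approximate)

def weight_gib(num_params: int, bytes_per_param: int) -> float:
    """Weight storage in GiB for a given precision."""
    return num_params * bytes_per_param / 2**30

fp8 = weight_gib(PARAMS, 1)   # FP8: 1 byte per parameter
fp16 = weight_gib(PARAMS, 2)  # FP16: 2 bytes per parameter
print(f"FP8: {fp8:.1f} GiB, FP16: {fp16:.1f} GiB")
```

This suggests the FP8 weights fit in roughly 8.4 GiB, about half the FP16 footprint, which is the usual motivation for shipping an FP8 quantization of a 9B model.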
