MCult01/glm-muse-v5

Text Generation · Model size: 9B · Quantization: FP8 · Context length: 32k · Published: Apr 18, 2026 · License: apache-2.0 · Architecture: Transformer · Concurrency cost: 1 · Open weights

MCult01/glm-muse-v5 is a 9-billion-parameter language model developed by MCult01, finetuned from MCult01/glm-muse-v4. It was trained with Unsloth and Hugging Face's TRL library for faster training, and its 32,768-token context length makes it suitable for tasks requiring extensive contextual understanding.


Model Overview

MCult01/glm-muse-v5 is a 9-billion-parameter language model developed by MCult01, building on its predecessor, MCult01/glm-muse-v4. This iteration was finetuned with Unsloth and Hugging Face's TRL library, a combination reported to make training roughly 2x faster.
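The exact v5 training recipe is not published, but the tooling the card names (Unsloth for memory-efficient model loading plus TRL's SFTTrainer) follows a common pattern. Below is a minimal, hypothetical sketch of that pattern; the dataset, LoRA settings, and hyperparameters are placeholders, not the author's settings.

```python
# Hypothetical finetuning sketch in the Unsloth + TRL style this card names.
# The dataset, LoRA settings, and hyperparameters below are placeholders,
# not MCult01's published recipe.
from unsloth import FastLanguageModel
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Load the stated base model; 4-bit loading is an assumption (QLoRA-style).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="MCult01/glm-muse-v4",
    max_seq_length=32768,
    load_in_4bit=True,
)

# Attach LoRA adapters; rank and target modules are illustrative choices.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# Hypothetical instruction-tuning data with a "text" column.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,  # newer TRL versions call this processing_class
    train_dataset=dataset,
    args=SFTConfig(
        dataset_text_field="text",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=8,
        max_steps=1000,
        output_dir="glm-muse-v5-checkpoints",
    ),
)
trainer.train()
```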

Key Characteristics

  • Developer: MCult01
  • Parameter Count: 9 billion
  • Context Length: 32,768 tokens (see the loading sketch after this list)
  • Quantization: FP8
  • Training Optimization: Finetuned with Unsloth and Hugging Face's TRL for roughly 2x faster training.
  • License: Apache-2.0
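For reference, here is a minimal sketch of loading the model for inference with the Hugging Face transformers library; the dtype and generation settings are assumptions for illustration, not documented recommendations.

```python
# Minimal inference sketch with Hugging Face transformers. The dtype and
# generation settings are assumptions, not documented recommendations.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MCult01/glm-muse-v5"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed load dtype; the listing reports FP8 quantization
    device_map="auto",
)

prompt = "Explain the trade-offs of long-context language models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```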

Potential Use Cases

Given its 32k context window and optimized training, glm-muse-v5 is well suited for applications requiring:

  • Processing and generating long-form content.
  • Tasks benefiting from extensive contextual understanding.
  • Scenarios where a 9B-parameter model offers a balance between performance and computational efficiency.
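As a concrete illustration of the long-context use case, the sketch below (continuing from the loading example above) budgets a long document into the 32,768-token window while reserving room for the generated output. The file path, instruction text, and token budget are hypothetical.

```python
# Continuing from the loading sketch above: fit a long document into the
# 32,768-token window while reserving room for the generated summary.
# The file path and token budget are hypothetical.
document = open("long_report.txt").read()

max_context = 32768
reserve_for_output = 512
instruction = "\n\nSummarize the document above in three paragraphs."
instruction_len = len(tokenizer(instruction).input_ids)

# Truncate the document, never the instruction, to stay within budget.
doc_ids = tokenizer(document, return_tensors="pt").input_ids[0]
budget = max_context - reserve_for_output - instruction_len
doc_ids = doc_ids[:budget]

prompt = tokenizer.decode(doc_ids, skip_special_tokens=True) + instruction
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=reserve_for_output)
# Decode only the newly generated tokens.
print(tokenizer.decode(output[0][inputs.input_ids.shape[1]:], skip_special_tokens=True))
```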