MCult01/glm-muse-feral-v5

Text Generation · Open Weights · Cold
Concurrency Cost: 1 | Model Size: 9B | Quant: FP8 | Ctx Length: 32k | Published: Apr 25, 2026 | License: apache-2.0 | Architecture: Transformer

MCult01/glm-muse-feral-v5 is a 9 billion parameter language model developed by MCult01, finetuned from MCult01/glm-muse-feral-v4. The model was trained with Unsloth and Hugging Face's TRL library, which the author reports made training 2x faster. It is intended for general language tasks.


Model Overview

MCult01/glm-muse-feral-v5 is a 9 billion parameter language model developed by MCult01. This iteration is a finetuned version of the MCult01/glm-muse-feral-v4 model, built upon the GLM4 architecture.

Key Characteristics

  • Efficient Training: The model was trained with Unsloth and Hugging Face's TRL library, which the author reports yielded a 2x speedup over a standard training setup.
  • Parameter Count: Features 9 billion parameters, offering a balance between performance and computational efficiency.
  • Context Length: Supports a context length of 32768 tokens, suitable for handling moderately long inputs.

Use Cases

This model is suitable for a variety of general language understanding and generation tasks where efficient training and a reasonable parameter count are beneficial. Its finetuned nature suggests potential for improved performance on specific domains or styles, depending on the finetuning data.
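As a starting point for the tasks above, the model can be loaded like any causal language model on the Hugging Face Hub. This is a minimal sketch, assuming the `MCult01/glm-muse-feral-v5` repository is publicly available and compatible with `transformers`' `AutoModelForCausalLM`; it is not an official usage example from the model author.

```python
MODEL_ID = "MCult01/glm-muse-feral-v5"  # repo id from this page

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion for `prompt` using the finetuned model.

    Imports are kept inside the function so this sketch only requires
    `transformers`/`torch` when it is actually called.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # torch_dtype="auto" picks up the published FP8/low-precision weights
    # where supported; device_map="auto" places layers on available GPUs.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Summarize the benefits of efficient finetuning."))
```

Keep prompts within the 32768-token context window; inputs longer than that must be truncated or chunked before generation.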