MCult01/glm-muse-feral-v3

Text Generation | Concurrency Cost: 1 | Model Size: 9B | Quant: FP8 | Ctx Length: 32k | Published: Apr 25, 2026 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold

MCult01/glm-muse-feral-v3 is a 9-billion-parameter language model developed by MCult01, finetuned from glm-muse-feral-v2. It was trained with Unsloth and Hugging Face's TRL library, which the authors report enabled roughly 2x faster training. The model supports a 32,768-token context length and is released under the Apache-2.0 license, making it suitable for applications that need efficient, performant language processing.


Model Overview

MCult01/glm-muse-feral-v3 is a 9-billion-parameter language model developed by MCult01, building on its predecessor, glm-muse-feral-v2. This iteration was trained with Unsloth and Hugging Face's TRL library, a combination the authors credit with roughly doubling training speed.

Key Characteristics

  • Parameter Count: 9 billion parameters, balancing output quality against computational cost.
  • Context Length: Supports a context window of 32,768 tokens, allowing it to process long inputs and generate coherent, extended outputs.
  • Training Efficiency: Finetuned with Unsloth and Hugging Face's TRL library for faster development cycles.
  • License: Distributed under the Apache-2.0 license, granting broad usage rights to developers and researchers.
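To make the 32,768-token context length concrete, here is a minimal sketch of a pre-flight check that estimates whether a prompt will fit while leaving headroom for generated output. The 4-characters-per-token ratio is a rough heuristic for English text, not a property of this model's tokenizer; for exact counts, use the model's actual tokenizer.

```python
# Rough check of whether a prompt fits a 32,768-token context window.
# CHARS_PER_TOKEN is a heuristic assumption, not this model's tokenizer.
CTX_LEN = 32_768          # context length from the model card
CHARS_PER_TOKEN = 4       # rough English-text heuristic (assumption)

def fits_in_context(prompt: str, reserve_for_output: int = 1024) -> bool:
    """Return True if the prompt likely fits, leaving room for generation."""
    est_tokens = len(prompt) // CHARS_PER_TOKEN + 1
    return est_tokens + reserve_for_output <= CTX_LEN

print(fits_in_context("Summarize this short note."))  # short prompt fits
```

A check like this is useful when batching long documents: prompts that fail it can be chunked or truncated before being sent to the model.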

Potential Use Cases

This model is well-suited to applications that benefit from a long context window, such as summarizing lengthy documents or sustaining multi-turn conversations. Its efficient training pipeline also makes it a reasonable candidate for projects that require rapid iteration on finetuned language models.
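For deployment planning, the card's stated size (9B parameters) and quantization (FP8) imply a back-of-the-envelope weight footprint, sketched below. This covers weights only; the KV cache and activations add more memory and depend on architecture details not listed on the card.

```python
# Weight-memory estimate for a 9B-parameter model at different precisions.
# FP8 stores one byte per parameter, FP16 two; weights only, no KV cache.
PARAMS = 9_000_000_000  # parameter count from the model card

def weight_gib(params: int, bytes_per_param: int) -> float:
    """Memory needed for the weights alone, in GiB."""
    return params * bytes_per_param / 2**30

fp8 = weight_gib(PARAMS, 1)   # FP8, as published: ~8.4 GiB
fp16 = weight_gib(PARAMS, 2)  # FP16 equivalent:   ~16.8 GiB
print(f"FP8: {fp8:.1f} GiB, FP16: {fp16:.1f} GiB")
```

At roughly 8.4 GiB of weights, the FP8 release can fit on a single consumer GPU with 16 GB or more of VRAM, leaving room for the KV cache at long context lengths.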