MCult01/glm-muse-feral-v2

  • Task: Text Generation
  • Concurrency Cost: 1
  • Model Size: 9B
  • Quant: FP8
  • Ctx Length: 32k
  • Published: Apr 23, 2026
  • License: apache-2.0
  • Architecture: Transformer
  • Open Weights

MCult01/glm-muse-feral-v2 is a 9-billion-parameter GLM-4 model developed by MCult01, finetuned from MCult01/glm-muse-feral. It was trained with Unsloth and Hugging Face's TRL library, which the authors report enabled 2x faster finetuning, and is intended for general language tasks.


Overview

MCult01/glm-muse-feral-v2 is a 9-billion-parameter language model developed by MCult01. It is a finetuned version of MCult01/glm-muse-feral, built on the GLM-4 architecture. The distinguishing feature of this model's development is its training efficiency: finetuning with Unsloth and Hugging Face's TRL library reportedly ran 2x faster than a standard setup.
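The card lists 9B parameters at FP8 quantization, which makes the weight footprint easy to estimate: FP8 stores one byte per parameter, so the weights alone come to roughly 9 GB. A back-of-the-envelope sketch (this ignores KV cache, activations, and runtime overhead, so treat it as a lower bound on serving memory, not a measured figure):

```python
# Rough weight-memory estimate for a 9B-parameter model stored in FP8.
# Ignores KV cache, activations, and framework overhead -- a lower
# bound on what serving the model would actually need.

PARAMS = 9e9          # 9 billion parameters (from the model card)
BYTES_PER_PARAM = 1   # FP8 = one byte per parameter

def weight_memory_gb(params: float = PARAMS, bytes_per_param: int = BYTES_PER_PARAM) -> float:
    """Approximate weight memory in gigabytes (1 GB = 1e9 bytes)."""
    return params * bytes_per_param / 1e9

print(f"~{weight_memory_gb():.0f} GB of weights in FP8")
```

The same arithmetic shows why FP8 matters here: the same 9B model in FP16 (2 bytes per parameter) would need about 18 GB for weights alone.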

Key Capabilities

  • Efficiently Finetuned: Trained with Unsloth and TRL for accelerated finetuning, suggesting the model can be adapted to new tasks relatively quickly and cheaply.
  • GLM-4 Architecture: Inherits the foundational capabilities of the GLM-4 model family.

Good For

  • Developers seeking a GLM-4 based model that has undergone an optimized finetuning process.
  • Applications where efficient training methodologies are a point of interest or advantage.