hmuegyi/my-en-translator-backup

Text generation | Model size: 7.6B | Quant: FP8 | Context length: 32k | Published: Feb 25, 2026 | License: apache-2.0 | Architecture: Transformer (open weights) | Concurrency cost: 1

hmuegyi/my-en-translator-backup is a 7.6 billion parameter Qwen2-based causal language model developed by hmuegyi. It was finetuned with Unsloth and Hugging Face's TRL library for faster training, and is designed for general language tasks, leveraging the Qwen2 architecture for robust performance.


Model Overview

hmuegyi/my-en-translator-backup is a 7.6 billion parameter language model based on the Qwen2 architecture. Developed by hmuegyi, it was finetuned from the unsloth/qwen2.5-7b-bnb-4bit base model.

Key Characteristics

  • Architecture: Qwen2-based causal language model.
  • Parameter Count: 7.6 billion parameters.
  • Training Efficiency: Finetuned with Unsloth and Hugging Face's TRL library, which enabled 2x faster training.
  • License: Distributed under the Apache-2.0 license.
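The listed parameter count and FP8 quantization allow a quick back-of-envelope estimate of weight memory: 7.6 billion parameters at one byte each is roughly 7.6 GB of weights, versus about 15.2 GB at FP16. The sketch below is illustrative arithmetic only; it ignores KV cache, activations, and runtime overhead, which add to the real footprint.

```python
def weight_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate weight storage in decimal GB: 1e9 params * N bytes = N GB."""
    return params_billion * bytes_per_param


if __name__ == "__main__":
    # FP8 stores one byte per parameter; FP16 stores two.
    print(f"FP8:  ~{weight_memory_gb(7.6, 1):.1f} GB")
    print(f"FP16: ~{weight_memory_gb(7.6, 2):.1f} GB")
```

This is why an FP8-quantized 7.6B model can fit comfortably on a single consumer GPU, where the FP16 weights alone would already strain a 16 GB card.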

Use Cases

This model is suited to natural language processing tasks, benefiting from its Qwen2 foundation and efficient finetuning; the repository name suggests it targets Myanmar-to-English translation in particular. Its moderate parameter count makes it a practical option for applications that need a capable language model without the serving cost of a much larger one.
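As a causal language model on the Hugging Face Hub, it should load with the standard transformers API. The sketch below is a hypothetical usage example: the prompt template is an assumption (the card does not document the finetuning prompt format), and generation settings are illustrative defaults.

```python
def build_prompt(text: str) -> str:
    # Illustrative instruction-style prompt; the model's actual
    # finetuning prompt format is not documented on the card.
    return f"Translate the following Myanmar text to English:\n{text}\nEnglish:"


def main() -> None:
    # Heavy dependency imported lazily so the prompt helper stays standalone.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "hmuegyi/my-en-translator-backup"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(build_prompt("မင်္ဂလာပါ"), return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(out[0][inputs["input_ids"].shape[-1]:],
                           skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

Since the model was finetuned from a 4-bit bnb base, loading with `load_in_4bit=True` (via bitsandbytes) is a reasonable alternative on memory-constrained hardware.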