ChuGyouk/35
Text Generation · Model Size: 4B · Quant: BF16 · Context Length: 32k · Published: Jan 20, 2026 · Architecture: Transformer · Concurrency Cost: 1

ChuGyouk/35 is a 4 billion parameter language model, fine-tuned from ChuGyouk/Qwen3-4B-Base-AGUINAS-0p5k. This model was specifically trained using TRL on the ChuGyouk/0120FINAL-SemEval24Task5 dataset, making it suitable for tasks related to the SemEval 2024 Task 5 challenge. It is designed for text generation and understanding within its specialized domain.


Overview

ChuGyouk/35 is a 4 billion parameter language model developed by ChuGyouk. It is a fine-tuned version of the ChuGyouk/Qwen3-4B-Base-AGUINAS-0p5k base model, adapted through Supervised Fine-Tuning (SFT) with the TRL framework on the ChuGyouk/0120FINAL-SemEval24Task5 dataset, which indicates specialization for SemEval 2024 Task 5.
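The checkpoint should load through the standard Hugging Face `transformers` API. The snippet below is a minimal sketch, assuming the repository ID `ChuGyouk/35` from this card; the `generate` helper and its prompt are purely illustrative and not part of the card.

```python
MODEL_ID = "ChuGyouk/35"  # repository ID from this model card

def load_model(model_id: str = MODEL_ID):
    """Load the tokenizer and model in BF16, matching the precision listed above."""
    # Imports kept local so the helper can be defined without the heavy deps installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # card lists BF16
        device_map="auto",
    )
    return tokenizer, model

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """One-shot generation; hypothetical usage, not prescribed by the card."""
    tokenizer, model = load_model()
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Since the card lists a 32k context length, long task inputs should fit in a single prompt without chunking.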

Key Capabilities

  • Specialized Text Generation: Fine-tuned on a task-specific dataset, suggesting proficiency in generating text relevant to the SemEval 2024 Task 5 domain.
  • TRL-Based Fine-Tuning: Built with the TRL library, which is best known for reinforcement learning from human feedback (RLHF) and related post-training methods; this model, however, was trained with plain supervised fine-tuning (SFT).

Training Details

The model was trained with the following framework versions:

  • TRL: 0.24.0
  • Transformers: 4.57.3
  • PyTorch: 2.9.1
  • Datasets: 4.3.0
  • Tokenizers: 0.22.1
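These versions point to a standard TRL `SFTTrainer` recipe. The sketch below is a hypothetical reconstruction, not the author's actual training script: the base-model and dataset IDs come from this card, but every hyperparameter, the output directory, and the dataset split are assumptions.

```python
def build_sft_config(output_dir: str = "qwen3-4b-semeval24t5-sft"):
    """Hypothetical SFT hyperparameters; the card does not publish the real ones."""
    from trl import SFTConfig

    return SFTConfig(
        output_dir=output_dir,
        per_device_train_batch_size=2,
        gradient_accumulation_steps=8,
        learning_rate=2e-5,
        num_train_epochs=3,
        bf16=True,           # matches the BF16 precision listed above
        max_length=32_768,   # matches the 32k context length
    )

def train():
    """Assemble and run the trainer; identifiers below are taken from this card."""
    from datasets import load_dataset
    from trl import SFTTrainer

    dataset = load_dataset("ChuGyouk/0120FINAL-SemEval24Task5", split="train")
    trainer = SFTTrainer(
        model="ChuGyouk/Qwen3-4B-Base-AGUINAS-0p5k",  # base model from this card
        train_dataset=dataset,
        args=build_sft_config(),
    )
    trainer.train()
```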

Good For

  • Researchers and developers working on SemEval 2024 Task 5 or related tasks.
  • Applications requiring text generation or understanding within the specific domain covered by the ChuGyouk/0120FINAL-SemEval24Task5 dataset.