ChuGyouk/F_R19

Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Architecture: Transformer · Published: Mar 28, 2026

ChuGyouk/F_R19 is an 8-billion-parameter causal language model, fine-tuned from ChuGyouk/Qwen3-8B-Base. It is optimized for general text generation, building on its base architecture and SFT training for improved conversational and creative outputs. With a 32,768-token context length, it suits applications requiring extended conversational memory or longer-document processing.


Model Overview

ChuGyouk/F_R19 is an 8-billion-parameter language model, fine-tuned from ChuGyouk/Qwen3-8B-Base. It has undergone Supervised Fine-Tuning (SFT) with the TRL framework, aiming to improve the coherence and contextual relevance of its generated text.

Key Capabilities

  • General Text Generation: Excels at producing human-like text based on given prompts.
  • Extended Context Handling: Supports a context window of 32,768 tokens, allowing longer interactions and larger documents to be processed in a single pass.
  • Instruction Following: Benefits from its fine-tuning process to better understand and respond to user instructions.
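The capabilities above can be exercised through the standard Hugging Face `transformers` API; the card itself does not ship usage code, so the following is a minimal sketch assuming the model follows the usual causal-LM loading pattern of its Qwen3-8B-Base parent. The `clamp_to_context` helper is an illustrative addition for staying inside the stated 32,768-token window; its name and the example prompt are not from the model card.

```python
MODEL_ID = "ChuGyouk/F_R19"
MAX_CONTEXT = 32768  # context window stated on the model card

def clamp_to_context(input_ids, max_new_tokens, max_context=MAX_CONTEXT):
    """Drop the oldest tokens so prompt + generation fit the context window."""
    budget = max_context - max_new_tokens
    if budget <= 0:
        raise ValueError("max_new_tokens exceeds the context window")
    return input_ids[-budget:] if len(input_ids) > budget else input_ids

def generate_reply(user_message, max_new_tokens=512):
    # Imports kept local so the helper above stays dependency-free.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    messages = [{"role": "user", "content": user_message}]
    # apply_chat_template tokenizes by default and returns a list of ids.
    prompt_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True)
    prompt_ids = clamp_to_context(prompt_ids, max_new_tokens)
    out = model.generate(
        torch.tensor([prompt_ids], device=model.device),
        max_new_tokens=max_new_tokens,
    )
    # Strip the prompt tokens and decode only the newly generated text.
    return tokenizer.decode(out[0][len(prompt_ids):], skip_special_tokens=True)
```

For long-document workloads, `clamp_to_context` keeps only the most recent tokens, which preserves the end of a conversation at the cost of its beginning; a different truncation policy may suit other applications.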

Training Details

The model was trained using the TRL (Transformer Reinforcement Learning) library, specifically its Supervised Fine-Tuning (SFT) workflow. The training environment used Transformers 5.2.0, PyTorch 2.10.0, and Datasets 4.3.0. Further details on the training run can be found on Weights & Biases.
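The card does not disclose the training data or hyperparameters, so the run cannot be reproduced exactly. As a rough illustration of the TRL SFT workflow it names, the sketch below uses `SFTTrainer` with a placeholder dataset id, assumed `question`/`answer` columns, and made-up hyperparameters; only the base-model id comes from the card.

```python
def to_prompt_completion(example):
    # Map a raw question/answer record onto the prompt-completion schema
    # that TRL's SFTTrainer accepts out of the box.
    return {"prompt": example["question"], "completion": example["answer"]}

def run_sft():
    # Imports kept local so the pure helper above has no dependencies.
    from datasets import load_dataset
    from trl import SFTConfig, SFTTrainer

    # Placeholder dataset id; assumed to expose "question"/"answer" columns.
    raw = load_dataset("some-org/some-qa-dataset", split="train")
    dataset = raw.map(to_prompt_completion, remove_columns=raw.column_names)

    config = SFTConfig(
        output_dir="F_R19-sft",          # illustrative output path
        per_device_train_batch_size=2,   # made-up hyperparameters
        gradient_accumulation_steps=8,
        learning_rate=2e-5,
        num_train_epochs=1,
    )
    trainer = SFTTrainer(
        model="ChuGyouk/Qwen3-8B-Base",  # the stated base model
        args=config,
        train_dataset=dataset,
    )
    trainer.train()
```

Passing the base model as a string lets `SFTTrainer` handle loading; the actual run may instead have loaded the model explicitly or used packing, a chat-format dataset, or a different learning-rate schedule.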