ChuGyouk/F_R15_1
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Architecture: Transformer · Published: Mar 28, 2026

ChuGyouk/F_R15_1 is an 8-billion-parameter language model fine-tuned by ChuGyouk from the Qwen3-8B-Base architecture. It was trained with supervised fine-tuning (SFT) using the TRL framework and is aimed at general text generation tasks. With a 32,768-token context window, it handles long conversational AI and question-answering workloads.
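A minimal sketch of running the model with the Hugging Face `transformers` library, assuming the repository id from the card header and a standard Qwen3-style chat template; generation settings are illustrative, not an official usage snippet:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id taken from the card header; downloading requires network access
# and enough memory for an 8B model.
MODEL_ID = "ChuGyouk/F_R15_1"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype="auto",   # pick the checkpoint's native precision
    device_map="auto",    # place layers on available GPU(s)/CPU
)

# Format a single-turn conversation with the tokenizer's chat template.
messages = [{"role": "user", "content": "Explain supervised fine-tuning in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and decode only the newly produced tokens.
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```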
