ChuGyouk/F_R18
Text generation · Model size: 8B · Quantization: FP8 · Context length: 32k · Concurrency cost: 1 · Architecture: Transformer · Published: Mar 28, 2026

ChuGyouk/F_R18 is an 8-billion-parameter language model fine-tuned from ChuGyouk/Qwen3-8B-Base using TRL. It is designed for general text generation, and its 32,768-token context length lets it process longer inputs. Training centered on supervised fine-tuning, which makes the model well suited to conversational AI and question-answering applications.
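As a rough illustration, the model could be loaded and prompted with the standard `transformers` API. This is a hedged sketch, not an official usage snippet from the model authors: it assumes the weights are hosted under the id `ChuGyouk/F_R18` and that the tokenizer ships a chat template (typical for Qwen3-based instruction models, but unverified here).

```python
def build_messages(user_message: str) -> list[dict]:
    """Build a single-turn chat message list in the common
    role/content format expected by chat templates."""
    return [{"role": "user", "content": user_message}]


if __name__ == "__main__":
    # Assumption: the model is available on the Hugging Face Hub
    # under this id; adjust if it is hosted elsewhere.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "ChuGyouk/F_R18"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

    # Render the chat prompt with the tokenizer's built-in template.
    prompt = tokenizer.apply_chat_template(
        build_messages("Explain supervised fine-tuning in two sentences."),
        tokenize=False,
        add_generation_prompt=True,
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=256)

    # Decode only the newly generated tokens, not the echoed prompt.
    reply = tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
    print(reply)
```

For long-context use, inputs up to the advertised 32k-token window can be passed the same way; generation settings such as `max_new_tokens` or sampling parameters are left at illustrative defaults above.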
