ChuGyouk/F_R5_1
Text generation · Concurrency cost: 1 · Model size: 8B · Quantization: FP8 · Context length: 32k · Published: Mar 27, 2026 · Architecture: Transformer

ChuGyouk/F_R5_1 is an 8-billion-parameter instruction-tuned causal language model developed by ChuGyouk and fine-tuned from ChuGyouk/Qwen3-8B-Base. It was trained with the TRL library and is optimized for general text generation with a 32,768-token context length, making it suitable for conversational AI and question-answering applications.
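A minimal usage sketch with the `transformers` library is shown below. It assumes the model is hosted on the Hugging Face Hub under the repo id `ChuGyouk/F_R5_1` (inferred from the card title) and that, as an instruction-tuned model, it ships a chat template; the example question and generation settings are illustrative, not from the card.

```python
# Hypothetical usage sketch; repo id and chat-template support are assumptions,
# not confirmed by the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "ChuGyouk/F_R5_1"  # assumed Hub repo id

def build_prompt(tokenizer, question: str) -> str:
    """Render a single-turn chat with the tokenizer's chat template."""
    messages = [{"role": "user", "content": question}]
    return tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    prompt = build_prompt(tokenizer, "Summarize what a causal language model is.")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, skipping the prompt.
    reply = tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    )
    print(reply)
```

Running this requires downloading the ~8B-parameter weights, so it is best executed on a GPU host; `device_map="auto"` lets `accelerate` place the weights automatically.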
