ChuGyouk/F_R8
Text Generation · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 8k · Published: Mar 30, 2026 · Architecture: Transformer

ChuGyouk/F_R8 is an 8-billion-parameter instruction-tuned causal language model developed by ChuGyouk and fine-tuned from ChuGyouk/Llama-3.1-8B. Trained using TRL, it targets general text generation, with a particular emphasis on conversational question answering. The model supports a context length of 8,192 tokens, making it suitable for a range of natural language understanding and generation applications.
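Since the base model is ChuGyouk/Llama-3.1-8B, conversational inputs presumably follow the Llama 3.1 chat format. The sketch below shows how such a prompt is assembled; the helper name `format_chat_prompt` is hypothetical, and the special tokens are an assumption inherited from the base model, so verify them against the tokenizer config in the repository.

```python
# Hypothetical sketch of Llama-3.1-style chat prompting for ChuGyouk/F_R8.
# The special tokens below are assumed from the base model
# (ChuGyouk/Llama-3.1-8B); check the repo's tokenizer_config to confirm.

MAX_CONTEXT_TOKENS = 8192  # context length stated on this model card


def format_chat_prompt(messages: list[dict[str, str]]) -> str:
    """Render {"role", "content"} messages as a single prompt string."""
    parts = ["<|begin_of_text|>"]
    for msg in messages:
        parts.append(f"<|start_header_id|>{msg['role']}<|end_header_id|>\n\n")
        parts.append(msg["content"])
        parts.append("<|eot_id|>")
    # Leave the assistant header open so the model generates the reply.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)


prompt = format_chat_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the capital of France?"},
])
print(prompt)
```

In practice you would not build this string by hand: load the tokenizer with `transformers.AutoTokenizer.from_pretrained("ChuGyouk/F_R8")` and call `tokenizer.apply_chat_template(messages)`, which applies whatever template the repository actually ships.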
