ChuGyouk/R99
- Task: Text generation
- Model size: 8B
- Quantization: FP8
- Context length: 8k
- Published: Mar 30, 2026
- Architecture: Transformer

ChuGyouk/R99 is an 8-billion-parameter causal language model fine-tuned from ChuGyouk/Llama-3.1-8B using Supervised Fine-Tuning (SFT) with TRL. It supports a context length of 8192 tokens and is designed for general text generation tasks, building on the Llama 3.1 architecture.
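A minimal usage sketch with the Hugging Face `transformers` text-generation pipeline. The model id comes from this card; the `build_messages` helper, the `generate` wrapper, and the `max_new_tokens` value are illustrative assumptions, not part of the model's documented API.

```python
MODEL_ID = "ChuGyouk/R99"  # model id from this card

def build_messages(user_prompt: str) -> list[dict]:
    # Chat-style input; Llama 3.1 fine-tunes typically accept
    # role/content message lists via the tokenizer's chat template.
    return [{"role": "user", "content": user_prompt}]

def generate(user_prompt: str, max_new_tokens: int = 256) -> str:
    # Imported lazily; requires `pip install transformers` and
    # downloads the model weights on first use.
    from transformers import pipeline

    generator = pipeline("text-generation", model=MODEL_ID)
    out = generator(build_messages(user_prompt), max_new_tokens=max_new_tokens)
    # The pipeline returns the full conversation; the last message
    # is the model's reply.
    return out[0]["generated_text"][-1]["content"]
```

Keeping the pipeline construction inside `generate` avoids loading the 8B weights at import time; for repeated calls, hoisting the `pipeline(...)` object out of the function would be more efficient.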
