OpceanAI/Yuuki-RxG
Task: text generation
Model size: 8B parameters
Quantization: FP8
Context length: 32k
Concurrency cost: 1
Published: Mar 19, 2026
License: apache-2.0
Architecture: Transformer

OpceanAI/Yuuki-RxG is an 8-billion-parameter instruction-tuned causal language model developed by OpceanAI, based on the DeepSeek-R1-0528-Qwen3-8B architecture. It has a 32,768-token context length and is fine-tuned on the Yuuki-Personality dataset, making it suitable for chat and reasoning tasks. The model is bilingual, supporting both English and Spanish, which broadens its use in conversational applications.
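As a minimal sketch of how a model like this is typically used for chat, the snippet below loads the checkpoint with the Hugging Face `transformers` library and applies the tokenizer's chat template. The repo id and the 32k context length come from this card; the `reserve_for_output` budget and generation settings are illustrative assumptions, not values documented by OpceanAI.

```python
MODEL_ID = "OpceanAI/Yuuki-RxG"  # repo id from the card
MAX_CTX = 32_768  # context length stated on the card


def truncate_to_context(token_ids, reserve_for_output=512, max_ctx=MAX_CTX):
    """Keep the most recent tokens so prompt + generated output fit the window.

    `reserve_for_output` is an assumed headroom for generation, not a model setting.
    """
    budget = max_ctx - reserve_for_output
    return token_ids[-budget:]


def chat(prompt, model, tokenizer, max_new_tokens=512):
    """Run one chat turn: format the prompt, generate, and decode only the reply."""
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Spanish prompt, since the card notes English/Spanish bilingual support.
    print(chat("¿Puedes resumir qué es un transformer?", model, tokenizer))
```

The truncation helper drops the oldest tokens first, which is the usual choice for chat history since recent turns matter most for the reply.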
