beyoru/Tama-JP-beta
Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Sep 30, 2025 · License: MIT · Architecture: Transformer · Open Weights

Tama-JP-beta is a 4-billion-parameter conversational AI model developed by Beyoru, continually pretrained from Qwen3-4B. It is optimized for immersive NSFW roleplay and natural chatting, distinguished by its willingness to generate explicit content. With a context length of 40960 tokens, the model targets use cases that require detailed, extended roleplay-style conversations.
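Since Tama-JP-beta is continued-pretrained from Qwen3-4B, it presumably inherits the Qwen family's ChatML conversation format (this is an assumption; check the model's bundled chat template to confirm). A minimal sketch of how a roleplay conversation would be rendered into a prompt string, with a hypothetical system persona:

```python
def format_chatml(messages, add_generation_prompt=True):
    """Render a message list in ChatML, the prompt format used by
    Qwen-family models (assumed unchanged in Tama-JP-beta)."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    if add_generation_prompt:
        # Open the assistant turn so the model continues from here.
        parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

# Hypothetical roleplay setup, for illustration only.
prompt = format_chatml([
    {"role": "system", "content": "You are Tama, a friendly roleplay partner."},
    {"role": "user", "content": "こんにちは!"},
])
print(prompt)
```

In practice you would let the tokenizer's `apply_chat_template` method build this string for you; the sketch just makes the underlying turn structure explicit.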
