limegreenpeper1/Qwen3-4B-Novel-JP
Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Mar 29, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm
limegreenpeper1/Qwen3-4B-Novel-JP is a 4-billion-parameter Qwen3 model fine-tuned by limegreenpeper1 for Japanese novel generation. Training was accelerated with Unsloth and Hugging Face's TRL library. The model is aimed at creative text generation in Japanese, and its 32,768-token context length supports generating longer, coherent narratives.
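As a minimal sketch of how the model could be used locally, the snippet below loads it with the Hugging Face transformers library in BF16 and generates a Japanese story opening. The model ID comes from this page; the prompt wording and generation settings (`max_new_tokens`, `temperature`, `top_p`) are illustrative assumptions, not recommendations from the model author.

```python
def build_prompt(premise: str) -> str:
    """Compose a simple Japanese novel-writing instruction (illustrative)."""
    return f"次の設定で小説の冒頭を書いてください。\n設定: {premise}\n"


def main() -> None:
    # transformers / torch are imported lazily so the prompt helper above
    # stays usable without the heavy dependencies installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "limegreenpeper1/Qwen3-4B-Novel-JP"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="bfloat16",  # matches the BF16 weights listed above
        device_map="auto",
    )

    prompt = build_prompt("雨の夜、灯りの消えた図書館")  # "a rainy night, a darkened library"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(
        **inputs,
        max_new_tokens=512,   # assumed sampling settings for creative text
        do_sample=True,
        temperature=0.8,
        top_p=0.95,
    )
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

Because the context window is 32k tokens, the same pattern works with much longer prompts, for example feeding in previous chapters as context before asking for a continuation.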