3tic/Orion-Qwen3-1.7B-CPT-v2603
Text generation · Model size: 2B · Quant: BF16 · Context length: 32k · Concurrency cost: 1 · Published: Mar 25, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

3tic/Orion-Qwen3-1.7B-CPT-v2603 is a 2-billion-parameter causal language model developed by 3tic, based on Qwen3-1.7B-Base and continually pre-trained (CPT) on over 20 billion tokens of Chinese and Japanese light-novel data. It is optimized for understanding and generating text in the style of light-novel narratives, and supports a 32,768-token (32k) context length.
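A minimal usage sketch, assuming the checkpoint is hosted under the repo id above and loadable through the standard Hugging Face transformers API (the model card does not provide an official example; the `generate` helper and its defaults are illustrative):

```python
MODEL_ID = "3tic/Orion-Qwen3-1.7B-CPT-v2603"
CTX_LEN = 32_768  # context length listed on the card


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Download the checkpoint and run generation on a single prompt.

    The transformers import is kept inside the function so the module
    can be inspected without the (heavy) dependency installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="bfloat16",  # matches the BF16 weights listed above
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Example prompt in the model's target domain (light-novel narration).
    print(generate("夕暮れの教室で、彼女は静かに本を閉じた。"))
```

Since the model was continually pre-trained on a base (not instruction-tuned) checkpoint, plain text continuation prompts like the one above are likely a better fit than chat-style templates.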
