3tic/Orion-Qwen3-1.7B-CPT-v2603

Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Mar 25, 2026 · License: apache-2.0 · Architecture: Transformer

3tic/Orion-Qwen3-1.7B-CPT-v2603 is a 1.7 billion parameter causal language model based on Qwen3-1.7B-Base, developed by 3tic. It has undergone continued pre-training (CPT) on over 20 billion tokens of Chinese and Japanese light novel text, making it well suited to understanding and generating light-novel narratives, and it supports a 32,768-token context length.


Model Overview

3tic/Orion-Qwen3-1.7B-CPT-v2603 is a base (non-instruct) model derived from the Qwen3-1.7B-Base architecture. Developed by 3tic, it extends the base checkpoint with continued pre-training on a specialized corpus exceeding 20 billion tokens of Chinese and Japanese light novel content, and it retains the base model's 32,768-token context window.
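
Since this is a standard Qwen3-family checkpoint, it should load through the usual transformers interfaces. The following is a minimal sketch, assuming the repo id above resolves on the Hugging Face Hub and that a BF16-capable device is available; the prompt is an invented example.

```python
# Minimal loading/generation sketch; repo id and prompt are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "3tic/Orion-Qwen3-1.7B-CPT-v2603"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 weights listed above
    device_map="auto",
)

# This is a base model (no instruction tuning), so give it raw narrative
# text to continue rather than a chat-formatted prompt.
prompt = "夜の街に、少女はひとり立っていた。"  # "In the night city, the girl stood alone."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```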

Key Capabilities

  • Specialized Domain Knowledge: Optimized for processing and generating text related to Chinese and Japanese light novels.
  • Large Context Window: Supports a 32,768-token context length, beneficial for understanding long-form narratives (see the token-budget sketch after this list).
  • Foundation Model: Serves as a robust base model for further fine-tuning on specific downstream tasks within its specialized domain.
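
To make use of the full window in practice, long inputs still need to be measured and trimmed in token space. Below is a minimal sketch; truncate_to_context is a hypothetical helper for illustration, not part of the model's tooling, and the 512-token output reserve is an arbitrary choice.

```python
# Hypothetical helper for fitting long narrative input into the
# 32,768-token context window, reserving room for the continuation.
from transformers import AutoTokenizer

MAX_CTX = 32768
tokenizer = AutoTokenizer.from_pretrained("3tic/Orion-Qwen3-1.7B-CPT-v2603")

def truncate_to_context(text: str, reserve_for_output: int = 512) -> str:
    """Keep the most recent tokens so the model continues from the latest scene."""
    ids = tokenizer(text, add_special_tokens=False)["input_ids"]
    budget = MAX_CTX - reserve_for_output
    if len(ids) > budget:
        ids = ids[-budget:]  # drop the oldest part of the narrative
    return tokenizer.decode(ids)
```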

Ideal Use Cases

  • Light Novel Generation: Creating new content or expanding existing narratives in the style of Chinese and Japanese light novels.
  • Text Analysis: Analyzing and extracting information from extensive light novel texts.
  • Domain-Specific Fine-tuning: A starting point for developing applications requiring deep understanding of light novel language and themes (a LoRA sketch follows this list).
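
As one possible path for the fine-tuning use case above, here is a minimal LoRA sketch using the peft library. The hyperparameters and target module names are assumptions (q_proj/k_proj/v_proj/o_proj are the usual Qwen attention projections), not values published for this model.

```python
# Minimal LoRA fine-tuning sketch; all hyperparameters are placeholders.
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model_id = "3tic/Orion-Qwen3-1.7B-CPT-v2603"
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

lora_cfg = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed Qwen3 attention projections
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()

# From here, train with your preferred loop or transformers.Trainer on
# domain-specific light-novel data.
```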