3tic/Orion-Qwen3-1.7B-SFT-v2603
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Mar 27, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

Orion-Qwen3-1.7B-SFT-v2603 by 3tic is a 1.7 billion parameter instruction-tuned translation model based on the Qwen3 architecture, fine-tuned specifically for light novel, game, and anime text. It supports glossary integration and context-aware translation, optimizing for consistent terminology and improved contextual understanding. With a 32,768-token context length, the model is suited to translating specialized Japanese content into Simplified Chinese, particularly media-related texts.
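A minimal sketch of how glossary integration and context-aware translation might be wired up as a chat prompt. The exact prompt format this fine-tune expects is not documented here, so the glossary-in-system-prompt layout, the `build_messages` helper, and the arrow notation are all assumptions, not the model's confirmed interface:

```python
def build_messages(source_text, glossary, context=None):
    """Assemble chat messages for a glossary-aware JA->zh-Hans translation.

    glossary: dict mapping Japanese terms to their fixed Simplified
    Chinese renderings, so recurring names/terms stay consistent.
    context: optional previously translated lines, to help the model
    resolve pronouns and keep terminology stable across a chapter.
    NOTE: this message layout is a plausible sketch, not the format
    documented for Orion-Qwen3-1.7B-SFT-v2603.
    """
    gloss_lines = "\n".join(f"{ja} -> {zh}" for ja, zh in glossary.items())
    system = (
        "Translate the following Japanese text into Simplified Chinese.\n"
        "Use these glossary entries consistently:\n" + gloss_lines
    )
    messages = [{"role": "system", "content": system}]
    if context:
        messages.append({"role": "user", "content": "Context:\n" + context})
    messages.append({"role": "user", "content": source_text})
    return messages

# Hypothetical glossary pinning one term's translation.
msgs = build_messages("ギルドに行こう。", {"ギルド": "公会"})
```

The resulting `msgs` list could then be passed to `tokenizer.apply_chat_template(...)` from Hugging Face Transformers and generated with the model in the usual way; staying well inside the 32k context window leaves room for long context blocks plus the output.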
