viamr-project/qwen3-1.7b-amr-20260124-0130
Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Jan 23, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

viamr-project/qwen3-1.7b-amr-20260124-0130 is a 1.7-billion-parameter Qwen3-based causal language model developed by viamr-project and fine-tuned from unsloth/Qwen3-1.7B. It was trained with Unsloth, which enables roughly 2x faster fine-tuning. With a 40,960-token context length, the model is suited to applications that need efficient processing of long sequences.
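As a minimal sketch, the model can be loaded through the standard Hugging Face `transformers` Auto classes, assuming the checkpoint is hosted on the Hub under the repo id shown above (the dtype and device settings below are illustrative and should be adjusted to your hardware):

```python
# Minimal loading sketch (assumption: the checkpoint lives on the
# Hugging Face Hub under this repo id).
REPO_ID = "viamr-project/qwen3-1.7b-amr-20260124-0130"

def load_model(repo_id: str = REPO_ID):
    """Return (tokenizer, model) using the BF16 weights the card lists."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        torch_dtype=torch.bfloat16,  # matches the Quant: BF16 entry above
        device_map="auto",           # spread layers over available GPU(s)/CPU
    )
    return tokenizer, model

# Typical usage (downloads the weights, so it is left commented out here):
# tokenizer, model = load_model()
# inputs = tokenizer("Hello!", return_tensors="pt").to(model.device)
# print(tokenizer.decode(model.generate(**inputs, max_new_tokens=64)[0]))
```

Because the base model is a standard Qwen3 checkpoint, no custom modeling code (`trust_remote_code`) should be required with a recent `transformers` release.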
