eunhyang/Qwen3-1.7B-base-MED
Text generation | Concurrency cost: 1 | Model size: 2B | Quantization: BF16 | Context length: 32k | Published: Mar 25, 2026 | Architecture: Transformer | Status: Warm

eunhyang/Qwen3-1.7B-base-MED is a 1.7-billion-parameter base language model from the Qwen3 family, published by eunhyang (the listing rounds this to 2B). With a 32768-token context window, it targets general language understanding and generation. As a base (non-instruction-tuned) checkpoint, it is well suited to further fine-tuning for applications that need robust language processing.
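Because this is a base checkpoint rather than a chat model, it is typically used for plain text continuation. The sketch below shows one way to load and sample from it with the Hugging Face `transformers` library, assuming only the model id and 32k context length from the card above; the prompt, token budget, and helper function are illustrative and untested against the real weights.

```python
# Sketch: text continuation with the base model via Hugging Face transformers.
# Assumes `transformers` and `torch` are installed and the checkpoint
# "eunhyang/Qwen3-1.7B-base-MED" is downloadable.

MODEL_ID = "eunhyang/Qwen3-1.7B-base-MED"
CTX_LENGTH = 32768  # context window stated on the model card


def clamp_new_tokens(prompt_tokens: int, requested: int, ctx: int = CTX_LENGTH) -> int:
    """Keep prompt + generated tokens within the 32k context window."""
    return max(0, min(requested, ctx - prompt_tokens))


if __name__ == "__main__":
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the quantization listed on the card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)

    prompt = "The capital of France is"
    inputs = tokenizer(prompt, return_tensors="pt")
    max_new = clamp_new_tokens(inputs["input_ids"].shape[1], requested=128)

    # Base model: plain continuation, no chat template applied.
    output = model.generate(**inputs, max_new_tokens=max_new)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The `clamp_new_tokens` helper is a hypothetical convenience that prevents requesting more tokens than the 32k window allows; the heavy loading and generation calls are kept under the `__main__` guard so the module can be imported without downloading weights.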
