PEKOMS/Qwen3-1.7B-base-MED_0325
Text generation · Concurrency cost: 1 · Model size: 2B · Quant: BF16 · Context length: 32k · Published: Mar 25, 2026 · Architecture: Transformer

PEKOMS/Qwen3-1.7B-base-MED_0325 is a 2-billion-parameter base language model published by PEKOMS, with a 32,768-token context length. As a base (pre-trained, non-instruct) model, it is intended chiefly as a foundation for further fine-tuning or for applications that need a compact yet capable starting point, rather than for direct instruction-following use.
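A minimal usage sketch with the Hugging Face `transformers` library, assuming the model is available on the Hub under the ID shown on this card and that a `transformers` version with Qwen3 support is installed; since this is a base model, the prompt is plain text to be completed, not a chat-template conversation:

```python
# Model ID taken from this card; availability on the Hub is assumed.
MODEL_ID = "PEKOMS/Qwen3-1.7B-base-MED_0325"


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Complete `prompt` with the base model (no chat template applied)."""
    # Lazy import so the module can be inspected without the heavy deps.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # torch_dtype="auto" picks up the BF16 weights listed on the card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("The capital of France is"))
```

For fine-tuning, the same `from_pretrained` call provides the starting checkpoint; only the training loop (or a trainer such as `transformers.Trainer`) changes.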
