aws-neuron/Qwen3-1.7B-TP4-BS4-SEQ2048
Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Nov 9, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm

Qwen3-1.7B is a 1.7 billion parameter causal language model developed by Qwen, part of the Qwen3 series. This model uniquely supports seamless switching between a 'thinking mode' for complex reasoning, math, and coding, and a 'non-thinking mode' for efficient general dialogue. It demonstrates enhanced reasoning capabilities, superior human preference alignment, and strong agentic functionality, supporting over 100 languages and dialects.
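A minimal sketch of switching between the two modes, assuming the upstream Qwen/Qwen3-1.7B checkpoint is used with Hugging Face transformers rather than this Neuron-compiled artifact; the `enable_thinking` flag follows Qwen3's published chat-template usage, and the prompt shown is illustrative only.

```python
# Sketch: toggling Qwen3's thinking mode via the chat template.
# Assumption: uses the upstream Qwen/Qwen3-1.7B checkpoint, not the
# TP4/BS4/SEQ2048 Neuron-compiled deployment.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen3-1.7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [{"role": "user", "content": "Solve 17 * 23 step by step."}]

# enable_thinking=True applies the reasoning ("thinking") template;
# set it to False for efficient non-thinking dialogue.
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
    enable_thinking=True,
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(output_ids[0][inputs.input_ids.shape[-1]:], skip_special_tokens=True))
```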


Popular Sampler Settings

The three parameter combinations most commonly used by Featherless users with this model. Each configuration is defined by the sampler settings listed below; a hedged example request using these parameters follows the list.

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
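
A minimal sketch of passing these sampler settings in an OpenAI-compatible chat completion request. The base URL https://api.featherless.ai/v1 and support for the non-standard samplers (top_k, repetition_penalty, min_p) via extra_body are assumptions, and the values shown are placeholders rather than the actual popular configurations.

```python
# Sketch: sending sampler settings to an OpenAI-compatible endpoint.
# Assumptions: Featherless exposes https://api.featherless.ai/v1 and accepts
# backend-specific samplers via extra_body. Values below are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",
    api_key="YOUR_FEATHERLESS_API_KEY",
)

response = client.chat.completions.create(
    model="aws-neuron/Qwen3-1.7B-TP4-BS4-SEQ2048",
    messages=[{"role": "user", "content": "Write a haiku about autumn."}],
    temperature=0.7,             # standard OpenAI sampler parameters
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    extra_body={                 # backend-specific extensions (assumed supported)
        "top_k": 20,
        "repetition_penalty": 1.05,
        "min_p": 0.0,
    },
)
print(response.choices[0].message.content)
```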