Qwen/Qwen3.5-27B
TEXT GENERATION · Open Weights · Warm · 0.8K

- Concurrency Cost: 2
- Model Size: 27B
- Quantization: FP8
- Context Length: 32k
- Published: Feb 24, 2026
- License: apache-2.0
- Modality: Vision
- Architecture: Transformer

Qwen3.5-27B is a 27 billion parameter multimodal causal language model developed by Qwen, featuring a unified vision-language foundation and an efficient hybrid architecture. It excels in reasoning, coding, agentic tasks, and visual understanding, supporting a native context length of 262,144 tokens, extensible up to 1,010,000 tokens. The model is designed for global accessibility with expanded support for 201 languages and dialects, making it suitable for diverse applications requiring advanced multimodal and multilingual capabilities.


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model cover the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
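The sampler settings above map directly onto fields of an OpenAI-style chat-completion request. Below is a minimal sketch of such a request payload; the parameter values are illustrative placeholders (the actual popular configurations are not shown on this page), and support for extended fields like `top_k`, `repetition_penalty`, and `min_p` varies by provider, so check the API documentation before relying on them.

```python
import json

# Illustrative chat-completion payload for Qwen/Qwen3.5-27B.
# All sampler values below are example placeholders, not the
# popular user configurations referenced above.
payload = {
    "model": "Qwen/Qwen3.5-27B",
    "messages": [
        {"role": "user", "content": "Explain FP8 quantization in one sentence."}
    ],
    # Core OpenAI-compatible sampler fields
    "temperature": 0.7,
    "top_p": 0.9,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    # Extended fields; not part of the base OpenAI spec,
    # but accepted by many open-model serving backends
    "top_k": 40,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
    "max_tokens": 256,
}

print(json.dumps(payload, indent=2))
```

Sending this body as JSON to a chat-completions endpoint (with an API key in the `Authorization` header) would produce a single sampled response; omit any extended field the target backend rejects.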