alpindale/WizardLM-2-8x22B
Text Generation · Open Weights

Concurrency Cost: 4
Model Size: 141B
Quant: FP8
Context Length: 32k
Published: Apr 16, 2024
License: apache-2.0
Architecture: Transformer

WizardLM-2 8x22B is a 141-billion-parameter Mixture of Experts (MoE) large language model developed by WizardLM@Microsoft AI and built on the Mixtral-8x22B-v0.1 base model. It is designed for complex chat, multilingual use, reasoning, and agent tasks, and it performs competitively against leading proprietary models in human preference evaluations across writing, coding, math, and reasoning, making it well suited to advanced conversational AI applications.


Popular Sampler Settings

The three most popular sampler configurations used by Featherless users for this model cover the following parameters (an example request showing where each setting goes follows the list):

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
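As a rough illustration, the sketch below shows how these sampler settings might be passed when requesting a chat completion from an OpenAI-compatible endpoint. The endpoint URL, the FEATHERLESS_API_KEY environment variable, and all parameter values are assumptions made for the example; they are not the actual user configurations referenced above.

```python
# Minimal sketch, assuming an OpenAI-compatible /chat/completions route.
# Parameter values are placeholders showing where each sampler knob goes.
import os
import requests

API_URL = "https://api.featherless.ai/v1/chat/completions"  # assumed endpoint
API_KEY = os.environ["FEATHERLESS_API_KEY"]                  # hypothetical env var

payload = {
    "model": "alpindale/WizardLM-2-8x22B",
    "messages": [
        {"role": "user", "content": "Explain mixture-of-experts routing in two sentences."}
    ],
    # Sampler settings listed above; values here are illustrative only.
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
    "max_tokens": 512,
}

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

Note that repetition_penalty and min_p are not part of the standard OpenAI parameter set, so if you use an official SDK rather than a raw HTTP request, they typically need to be passed as extra body fields.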