arcee-ai/SuperNova-Medius
TEXT GENERATION · Open Weights · Warm
Concurrency Cost: 1 | Model Size: 14.8B | Quant: FP8 | Ctx Length: 32k
Published: Oct 2, 2024 | License: apache-2.0 | Architecture: Transformer | 0.2K
Arcee-SuperNova-Medius is a 14.8 billion parameter language model developed by Arcee.ai, built on the Qwen2.5-14B-Instruct architecture. This model leverages a cross-architecture distillation pipeline, combining knowledge from Qwen2.5-72B-Instruct and Llama-3.1-405B-Instruct to achieve high-quality instruction-following and complex reasoning. It is optimized for business use cases like customer support, content creation, and technical assistance, offering advanced capabilities in a resource-efficient package.
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.

- temperature: –
- top_p: –
- top_k: –
- frequency_penalty: –
- presence_penalty: –
- repetition_penalty: –
- min_p: –
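These sampler parameters correspond to fields of an OpenAI-style chat completion request, the interface commonly used to query hosted models like this one. The sketch below assembles such a request body in Python; the sampler values shown are illustrative assumptions, not settings recorded for this model.

```python
# Sketch: build an OpenAI-style chat completion payload for
# arcee-ai/SuperNova-Medius. Sampler values here are illustrative
# placeholders, not recommendations from the model card.

SAMPLER_KEYS = {
    "temperature", "top_p", "top_k", "frequency_penalty",
    "presence_penalty", "repetition_penalty", "min_p",
}

def build_payload(prompt: str, **samplers) -> dict:
    """Assemble a chat completion request body, keeping only
    recognized sampler parameters that the caller sets."""
    payload = {
        "model": "arcee-ai/SuperNova-Medius",
        "messages": [{"role": "user", "content": prompt}],
    }
    payload.update({k: v for k, v in samplers.items() if k in SAMPLER_KEYS})
    return payload

payload = build_payload(
    "Summarize the model's distillation pipeline.",
    temperature=0.7,          # illustrative value
    top_p=0.9,                # illustrative value
    repetition_penalty=1.05,  # illustrative value
    seed=42,                  # unrecognized key: silently dropped
)
```

Passing this payload to any OpenAI-compatible endpoint would apply the chosen samplers to generation; unrecognized keys are filtered out so only valid sampler fields reach the server.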