abacusai/Smaug-2-72B
Text generation
Concurrency cost: 4
Model size: 72.3B
Quantization: FP8
Context length: 32k
Published: Mar 29, 2024
License: tongyi-qianwen
Architecture: Transformer

abacusai/Smaug-2-72B is a 72.3-billion-parameter language model fine-tuned from Qwen1.5-72B-Chat and optimized for reasoning and coding tasks. It outperforms its base model on benchmarks such as MT-Bench and HumanEval, and is intended for applications that require strong logical inference and code generation.


Popular Sampler Settings

The three parameter combinations most used by Featherless users for this model involve the following sampler parameters:

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
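The sampler parameters above map directly onto the fields of an OpenAI-compatible completion request. Below is a minimal sketch of such a request payload; the endpoint URL and all numeric values are illustrative assumptions, not the actual popular configurations from the tabs above.

```python
# Hypothetical sketch of a completion request payload for this model.
# All sampler values below are placeholders, not the Featherless user configs.
payload = {
    "model": "abacusai/Smaug-2-72B",
    "prompt": "Write a Python function that reverses a linked list.",
    "temperature": 0.7,        # randomness of token sampling
    "top_p": 0.9,              # nucleus sampling: keep smallest set with this cumulative prob
    "top_k": 40,               # restrict sampling to the k most likely tokens
    "frequency_penalty": 0.0,  # penalize tokens in proportion to how often they appeared
    "presence_penalty": 0.0,   # penalize tokens that have appeared at all
    "repetition_penalty": 1.1, # multiplicative penalty on repeated tokens
    "min_p": 0.05,             # drop tokens below this fraction of the top token's prob
}

# Sending the request needs an API key, so it is commented out here
# (endpoint URL is an assumption):
# import requests
# resp = requests.post(
#     "https://api.featherless.ai/v1/completions",
#     headers={"Authorization": "Bearer <API_KEY>"},
#     json=payload,
#     timeout=60,
# )
# print(resp.json()["choices"][0]["text"])
```

Note that `repetition_penalty` and `min_p` are extensions beyond the core OpenAI parameter set; support for them varies by inference backend.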