watt-ai/watt-tool-70B
Text Generation · Concurrency Cost: 4 · Model Size: 70B · Quant: FP8 · Ctx Length: 32k · Published: Dec 19, 2024 · License: apache-2.0 · Architecture: Transformer · 0.1K · Open Weights · Warm

watt-tool-70B is a 70-billion-parameter language model developed by watt-ai, fine-tuned from Llama-3.3-70B-Instruct. It is optimized for complex tool usage and multi-turn dialogue, achieving state-of-the-art performance on the Berkeley Function-Calling Leaderboard. The model excels at understanding user requests, selecting appropriate tools, and executing them across multiple conversational turns, making it well suited for AI workflow-building platforms.
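As a sketch of how such a tool-calling model is typically driven, the snippet below assembles a chat-completion request in the widely used OpenAI-style function-calling format. The endpoint shape, the `get_weather` tool, and its schema are illustrative assumptions, not part of this model card.

```python
import json

# Hypothetical tool schema in the OpenAI-style function-calling format;
# a tool-use model selects a tool and fills its arguments from such schemas.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # illustrative tool, not defined by the model
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

def build_request(user_message: str) -> dict:
    """Assemble a chat-completion payload for an OpenAI-compatible endpoint."""
    return {
        "model": "watt-ai/watt-tool-70B",
        "messages": [{"role": "user", "content": user_message}],
        "tools": tools,
        "tool_choice": "auto",  # let the model decide whether to call a tool
    }

payload = build_request("What's the weather in Berlin?")
print(json.dumps(payload, indent=2))
```

In a multi-turn loop, the tool result would be appended back to `messages` as a `tool`-role message before the next request, letting the model chain calls across turns.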


Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model. Each config sets the following sampler parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
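These sampler parameters are typically passed alongside the model and messages in the request body. The sketch below shows one way to merge them in; the specific values are placeholders, not the actual Featherless user configs, which are only visible in the interactive tabs.

```python
# Illustrative sampler settings -- placeholder values, not recommendations.
sampler_config = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

# Merge the sampler parameters into an OpenAI-style request body.
request = {
    "model": "watt-ai/watt-tool-70B",
    "messages": [{"role": "user", "content": "Hello"}],
    **sampler_config,
}
```

Keeping the sampler settings in a separate dict makes it easy to swap between the saved configurations without touching the rest of the request.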