watt-ai/watt-tool-8B
Text Generation · Open Weights · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Published: Dec 19, 2024 · License: apache-2.0 · Architecture: Transformer
watt-ai/watt-tool-8B is an 8-billion-parameter language model based on Llama-3.1-8B-Instruct, fine-tuned specifically for advanced tool usage and multi-turn dialogue. It demonstrates state-of-the-art performance on the Berkeley Function-Calling Leaderboard (BFCL). The model excels at understanding complex user requests and orchestrating tool execution across extended conversations, making it well suited to AI workflow automation platforms.
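BFCL-style models typically emit tool invocations as bracketed call strings rather than free text, which the calling application must parse before dispatching to a real function. A minimal sketch of such a parser, assuming a single call in the form `[name(arg="value")]` (the exact output format is an assumption here; adjust it to what the model actually emits):

```python
import re

def parse_tool_call(text: str):
    """Parse a BFCL-style tool call such as [get_weather(city="Paris")].

    Returns a dict with the function name and keyword arguments, or
    None if the text does not look like a tool call. The call format
    is an assumption for illustration, not the model's documented spec.
    """
    m = re.match(r'\[(\w+)\((.*)\)\]', text.strip())
    if not m:
        return None
    name, arg_str = m.group(1), m.group(2)
    # Collect string-valued keyword arguments of the form key="value".
    args = dict(re.findall(r'(\w+)="([^"]*)"', arg_str))
    return {"name": name, "arguments": args}

call = parse_tool_call('[get_weather(city="Paris")]')
# call["name"] is "get_weather"; call["arguments"] is {"city": "Paris"}
```

In a multi-turn workflow, the parsed call would be executed and its result appended to the conversation as a tool message before the next generation step.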
Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model.

temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
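The sampler parameters above correspond to fields in an OpenAI-compatible chat completion request. A minimal sketch of such a request payload follows; the values are illustrative placeholders, not the community presets from the table, and the non-standard fields are extensions that some OpenAI-compatible servers accept:

```python
# Illustrative request body for an OpenAI-compatible chat completions
# endpoint. All sampler values below are example assumptions, not
# recommendations sourced from this page.
payload = {
    "model": "watt-ai/watt-tool-8B",
    "messages": [
        {"role": "user", "content": "Book a table for two at 7pm."},
    ],
    "temperature": 0.7,        # sampling randomness
    "top_p": 0.9,              # nucleus sampling cutoff
    "frequency_penalty": 0.0,  # penalize frequent tokens
    "presence_penalty": 0.0,   # penalize already-seen tokens
    # Extension fields supported by some OpenAI-compatible servers:
    "top_k": 40,               # restrict sampling to the k most likely tokens
    "repetition_penalty": 1.0, # multiplicative repetition penalty
    "min_p": 0.0,              # minimum probability floor relative to the top token
    "max_tokens": 256,
}
```

Posting this payload to the server's `/chat/completions` route (with an API key) would return the model's response in the standard OpenAI response shape.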