hiieu/Meta-Llama-3-8B-Instruct-function-calling-json-mode
TEXT GENERATION | Concurrency Cost: 1 | Model Size: 8B | Quant: FP8 | Ctx Length: 8k | Published: Apr 20, 2024 | Architecture: Transformer
hiieu/Meta-Llama-3-8B-Instruct-function-calling-json-mode is an 8 billion parameter instruction-tuned causal language model, fine-tuned from Meta-Llama-3-8B-Instruct. Developed by hiieu, this model is specifically optimized for reliable function calling and JSON mode output. It excels at structured data generation and integration with external tools, making it suitable for agentic workflows and API interactions.
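Since the model is tuned for OpenAI-style function calling, a request to it typically carries a tool schema alongside the chat messages. A minimal sketch of such a payload, assuming an OpenAI-compatible chat completions API; the `get_weather` tool and the exact request shape are illustrative assumptions, not details from this page:

```python
import json

MODEL_ID = "hiieu/Meta-Llama-3-8B-Instruct-function-calling-json-mode"

# Hypothetical example tool in the OpenAI function-calling schema.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# Chat-completion request body: messages plus the tool definitions
# the model may call, with tool selection left to the model.
payload = {
    "model": MODEL_ID,
    "messages": [
        {"role": "user", "content": "What's the weather in Hanoi?"},
    ],
    "tools": [get_weather_tool],
    "tool_choice": "auto",
}

# Serializes to plain JSON, ready to POST to a compatible endpoint.
print(json.dumps(payload, indent=2))
```

On a successful tool call, the model's reply would contain a structured `tool_calls` entry naming `get_weather` with JSON arguments, rather than free-form text.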
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model, covering: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p.