hiieu/Meta-Llama-3-8B-Instruct-function-calling-json-mode
hiieu/Meta-Llama-3-8B-Instruct-function-calling-json-mode is an 8 billion parameter instruction-tuned causal language model, fine-tuned from Meta-Llama-3-8B-Instruct. Developed by hiieu, this model is specifically optimized for reliable function calling and JSON mode output. It excels at structured data generation and integration with external tools, making it suitable for agentic workflows and API interactions.
Model Overview
This model, hiieu/Meta-Llama-3-8B-Instruct-function-calling-json-mode, is an 8 billion parameter instruction-tuned variant of Meta-Llama-3-8B-Instruct. It has been specifically fine-tuned to enhance its capabilities in two key areas: function calling and JSON mode output.
Key Capabilities
- Reliable JSON Mode: The model can generate responses strictly in valid JSON; for example, it can wrap its answer in an object under a key specified in the system prompt.
- Function Calling: It supports a two-step inference process for function calling: the first pass parses the user query, identifies a relevant function, and emits a function call with structured arguments; the second pass incorporates the function's result into the final response. This facilitates integration with external tools and APIs.
- Optimized Training: The model was trained with Unsloth and Hugging Face's TRL library, making training roughly 2x faster.
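A minimal sketch of how JSON mode is typically invoked and validated. The exact system-prompt wording and the `"message"` key are illustrative assumptions (check the model card for the canonical instruction); the simulated reply stands in for real output from `tokenizer.apply_chat_template` + `model.generate`:

```python
import json

def build_json_mode_messages(user_query: str, key: str = "message") -> list:
    """Build a chat message list that requests strict JSON output.

    The system prompt below is a hypothetical example of a JSON-mode
    instruction; the fine-tune may expect slightly different wording.
    """
    return [
        {"role": "system",
         "content": f'You are a helpful assistant, answer in JSON with key "{key}"'},
        {"role": "user", "content": user_query},
    ]

def parse_json_reply(raw_reply: str, key: str = "message") -> str:
    """Validate that the model's reply is strict JSON containing the expected key."""
    data = json.loads(raw_reply)  # raises ValueError if the reply is not valid JSON
    if key not in data:
        raise KeyError(f"expected key {key!r} in model output")
    return data[key]

messages = build_json_mode_messages("Who are you?")
# Simulated model reply in the expected strict-JSON shape:
answer = parse_json_reply('{"message": "I am an AI assistant."}')
```

Validating the reply with `json.loads` on every call is a cheap guard: if the model ever drifts out of JSON mode, the failure surfaces immediately instead of propagating malformed text downstream.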
Good For
- Structured Data Generation: Ideal for applications requiring consistent JSON output.
- Agentic Workflows: Suitable for building AI agents that need to interact with external systems via function calls.
- API Integration: Simplifies the process of converting natural language requests into API calls with structured arguments.
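The two-step function-calling flow can be sketched as follows. The tool schema style, the `get_weather` function, and the shape of the model's function-call output are all illustrative assumptions, not the model's documented format:

```python
import json

# Step 1: the model sees the user query plus available tool schemas and emits
# a JSON function call. This OpenAI-style schema is an illustrative assumption.
tools = [{
    "name": "get_weather",  # hypothetical tool
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}]

def get_weather(city: str) -> str:
    """Stub standing in for a real weather API call."""
    return f"22C and sunny in {city}"

def dispatch_call(raw_call: str, registry: dict) -> str:
    """Parse the model's JSON function call and run the matching local function."""
    call = json.loads(raw_call)
    fn = registry[call["name"]]
    return fn(**call["arguments"])

# Simulated step-1 model output (a real run would generate this string):
raw_call = '{"name": "get_weather", "arguments": {"city": "Hanoi"}}'
result = dispatch_call(raw_call, {"get_weather": get_weather})

# Step 2: the tool result is appended to the conversation and the model is
# queried a second time to produce the final natural-language answer.
followup_message = {"role": "tool", "content": result}
```

Because the arguments arrive as structured JSON rather than free text, the dispatch step is a plain dictionary lookup plus `**kwargs` expansion, which is what makes this pattern robust enough for agentic loops.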