Matter-0.1-Slim-7B-C: A Function-Calling Optimized Mistral Finetune
Matter-0.1-Slim-7B-C is a 7-billion-parameter model built on the Mistral 7B architecture via continued full fine-tuning. It was trained on the slim-C variant of the Matter dataset, a curated collection drawn from over 35 distinct datasets and comprising more than 6 billion tokens.
Key Capabilities
- Advanced Function Calling: This model is specifically engineered with robust support for function calling, enabling it to generate structured function calls and process function responses. It uses dedicated tokens (<|begin_func|>, <|end_func|>, <|begin_func_response|>, <|end_func_response|>) to clearly demarcate function interactions.
- ChatML Format: It adheres to the ChatML prompt format, ensuring compatibility with common chat-based interfaces and frameworks.
- Efficient Training: The model was trained for 3 epochs over approximately 17 hours using Axolotl on 4x A100 GPUs (80GB).
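As a sketch of how the ChatML format and the function-calling tokens fit together, the snippet below assembles a prompt that advertises a hypothetical `get_weather` tool in the system message. The tool schema and its placement in the system prompt are illustrative assumptions, not documented conventions of this model; the delimiter tokens themselves come from the card above.

```python
import json

# Hypothetical tool schema for illustration; adapt to your own tools.
GET_WEATHER_SCHEMA = {
    "name": "get_weather",
    "description": "Fetch current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

def chatml_turn(role: str, content: str) -> str:
    """Wrap a single message in ChatML delimiters."""
    return f"<|im_start|>{role}\n{content}<|im_end|>\n"

def build_prompt(user_message: str) -> str:
    """Build a ChatML prompt that describes the available tool to the model."""
    system = (
        "You are a helpful assistant with access to the following function. "
        "Call it with <|begin_func|>...<|end_func|> when needed.\n"
        + json.dumps(GET_WEATHER_SCHEMA)
    )
    return (
        chatml_turn("system", system)
        + chatml_turn("user", user_message)
        + "<|im_start|>assistant\n"  # generation continues from here
    )

prompt = build_prompt("What's the weather in Lisbon?")
print(prompt)
```

The resulting string can be passed to any text-generation pipeline; in practice you would register the special tokens with the tokenizer rather than relying on raw string concatenation.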
Ideal Use Cases
- Tool-Augmented Applications: Excellent for developing AI assistants that need to interact with external tools, APIs, or databases to fetch information or perform actions.
- Structured Output Generation: Suitable for scenarios where the model needs to produce structured outputs beyond natural language, such as JSON for function arguments.
- Conversational AI with External Integrations: Can power chatbots or virtual assistants that require dynamic information retrieval or task execution through function calls.
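For tool-augmented applications like those above, the host program has to detect a function call in the model's output, run the tool, and feed the result back. A minimal sketch, assuming the model emits a JSON object between the function tokens (the token names are from this card; the JSON payload shape is an assumption):

```python
import json
import re

# The delimiter tokens are documented by the model card; the JSON call
# format inside them is assumed for illustration.
FUNC_CALL_RE = re.compile(r"<\|begin_func\|>(.*?)<\|end_func\|>", re.DOTALL)

def extract_function_call(model_output: str):
    """Return the parsed function-call dict, or None if the model replied in prose."""
    match = FUNC_CALL_RE.search(model_output)
    if match is None:
        return None
    return json.loads(match.group(1))

def format_function_response(result) -> str:
    """Wrap a tool result so it can be appended to the conversation."""
    return f"<|begin_func_response|>{json.dumps(result)}<|end_func_response|>"

# Example round trip with a simulated model output.
output = '<|begin_func|>{"name": "get_weather", "arguments": {"city": "Lisbon"}}<|end_func|>'
call = extract_function_call(output)
print(call["name"], call["arguments"])
reply = format_function_response({"temp_c": 21, "condition": "sunny"})
print(reply)
```

The wrapped response string is then appended to the conversation and the model is queried again, letting it compose a natural-language answer from the tool result.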