Matter-0.1-Slim-7B-B Overview
Matter-0.1-Slim-7B-B is a 7-billion-parameter language model, developed by 0-hero, that has been fully fine-tuned from a Mistral 7B base. Its training leveraged the extensive Matter-0.1-Slim-B dataset, which aggregates and curates data from over 35 distinct datasets, encompassing more than 6 billion tokens. Training took approximately 15 hours over 3 epochs on 4x A100 GPUs (80GB) using Axolotl.
Key Capabilities
- Function Calling Support: A primary differentiator, this model is explicitly designed to handle function calls, enabling it to interact with external tools and APIs. It includes dedicated tokens (<|begin_func|>, <|end_func|>, <|begin_func_response|>, and <|end_func_response|>) for structured function invocation and response parsing.
- ChatML Prompt Format: The model uses the ChatML prompt format, facilitating clear, structured conversational interactions with system, user, and assistant roles.
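The prompt conventions above can be sketched as follows. The ChatML role markers and the four function tokens come from the description above; the JSON payload shape inside the function tokens is an illustrative assumption, not a documented training format.

```python
# Minimal sketch of prompt assembly for this model.
# Assumption: function calls are serialized as JSON between the dedicated
# tokens; the actual payload format the model was trained on may differ.
import json


def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML prompt with system and user turns,
    leaving an open assistant turn for the model to complete."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )


def wrap_func_call(name: str, arguments: dict) -> str:
    """Wrap a (hypothetical) function call in the model's dedicated tokens."""
    payload = json.dumps({"name": name, "arguments": arguments})
    return f"<|begin_func|>{payload}<|end_func|>"


prompt = build_chatml_prompt(
    "You are a helpful assistant.", "What's the weather in Paris?"
)
call = wrap_func_call("get_weather", {"city": "Paris"})
```

In practice the assembled prompt string would be passed to the tokenizer and model; the special tokens are part of the model's vocabulary, so they should not be split during tokenization.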
Good For
- Applications requiring tool use: Ideal for scenarios where the AI needs to perform actions or retrieve information via external functions, such as integrating with databases, APIs, or other software components.
- Conversational agents with structured interaction needs: Suitable for building chatbots or assistants that must not only generate text but also execute specific commands based on user input.
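A tool-use round trip with this model might look like the following sketch: detect a function call in the model's completion, run the tool, and wrap the result in the response tokens for the next turn. The token names are from the model card; the JSON call format and the example weather tool are illustrative assumptions.

```python
# Sketch of a function-calling round trip.
# Assumptions: calls are JSON objects with "name" and "arguments" keys
# between <|begin_func|> / <|end_func|>; the tool itself is hypothetical.
import json
import re

FUNC_RE = re.compile(r"<\|begin_func\|>(.*?)<\|end_func\|>", re.DOTALL)


def extract_func_call(completion: str):
    """Return (name, arguments) if the completion contains a function call,
    else None."""
    m = FUNC_RE.search(completion)
    if not m:
        return None
    call = json.loads(m.group(1))
    return call["name"], call.get("arguments", {})


def format_func_response(result) -> str:
    """Wrap a tool result in the response tokens for the follow-up turn."""
    return f"<|begin_func_response|>{json.dumps(result)}<|end_func_response|>"


# Example completion the model might emit (illustrative):
completion = '<|begin_func|>{"name": "get_weather", "arguments": {"city": "Paris"}}<|end_func|>'
name, args = extract_func_call(completion)
response_turn = format_func_response({"temperature_c": 18, "condition": "cloudy"})
```

The formatted response turn would then be appended to the conversation so the model can produce a natural-language answer grounded in the tool's output.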