0-hero/Matter-0.1-Slim-7B-A

Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Mar 13, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Matter-0.1-Slim-7B-A is a 7-billion-parameter language model developed by 0-hero, fine-tuned from Mistral 7B. It is trained on the Matter-0.1-Slim-A dataset, curated from over 35 source datasets with more than 6 billion tokens analyzed during curation. The model ships with integrated function calling, making it suitable for applications that require structured interactions and tool use.


Matter-0.1-Slim-7B-A Overview

Matter-0.1-Slim-7B-A is a 7-billion-parameter language model, a fine-tuned variant of Mistral 7B, developed by 0-hero. It was trained for approximately 15 hours over 3 epochs on 4x A100 GPUs using Axolotl, on the Matter-0.1-Slim-A dataset. That dataset comprises around 285,000 rows, curated from over 35 distinct source datasets with more than 6 billion tokens analyzed during curation.
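
If the checkpoint follows the standard Hugging Face layout, loading it is a one-liner pair with transformers. The sketch below is a minimal, non-authoritative example; the repo id `0-hero/Matter-0.1-Slim-7B-A`, the dtype, and the device placement are assumptions to adapt to your environment, and the bundled chat template (if any) may differ:

```python
# Minimal sketch: loading the model with Hugging Face transformers.
# Assumes the weights are hosted on the Hub as "0-hero/Matter-0.1-Slim-7B-A";
# adjust the repo id, dtype, and device map for your setup.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "0-hero/Matter-0.1-Slim-7B-A"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # Mistral-family models commonly run in bf16
    device_map="auto",
)

# ChatML-style conversation; apply_chat_template uses the chat template
# bundled with the checkpoint, if one is present.
messages = [
    {"role": "user", "content": "Summarize what function calling is in one sentence."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```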

Key Capabilities

  • Function Calling: The model natively supports function calling, using special tokens (<|begin_func|>, <|end_func|>, <|begin_func_response|>, <|end_func_response|>) for structured interaction with external tools or APIs (see the prompt sketch after this list).
  • ChatML Format: It adheres to the ChatML prompt format, ensuring compatibility with common conversational AI frameworks.
  • Specialized Training Data: Fine-tuned on a highly curated dataset, suggesting potential optimization for specific domains or tasks represented within the Matter dataset.
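
To show how the ChatML format and the function tokens fit together, the sketch below assembles a tool-use conversation as raw prompt text. The special tokens and the ChatML delimiters come from the model card; the `get_weather` tool, its JSON schema, and the exact payload shapes are hypothetical assumptions about the expected format:

```python
# Illustrative sketch of a ChatML + function-token prompt layout.
# The special tokens are documented for this model; the function schema
# and JSON payloads below are assumptions for illustration only.
import json

weather_fn = {
    "name": "get_weather",  # hypothetical tool, not part of the model card
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

# System turn advertises the available tools; user turn asks a question.
prompt = (
    "<|im_start|>system\n"
    f"You are a helpful assistant with access to these functions: {json.dumps(weather_fn)}<|im_end|>\n"
    "<|im_start|>user\n"
    "What's the weather in Paris?<|im_end|>\n"
    "<|im_start|>assistant\n"
)

# The model is expected to emit a function call wrapped in its special tokens,
# e.g.:
#   <|begin_func|>{"name": "get_weather", "arguments": {"city": "Paris"}}<|end_func|>
# After executing the tool, the caller feeds the result back between the
# response tokens and lets the model continue generating:
tool_result = {"city": "Paris", "temperature_c": 18, "conditions": "partly cloudy"}
followup = f"<|begin_func_response|>{json.dumps(tool_result)}<|end_func_response|>\n"
```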

Good For

  • Applications requiring integrated function calling for tool use or API interaction.
  • Developers seeking a Mistral 7B-based model with specialized fine-tuning.
  • Use cases benefiting from a model trained on a diverse, large-scale curated dataset.