0-hero/Matter-0.1-7B-boost

Text Generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Mar 19, 2024 · License: apache-2.0 · Architecture: Transformer

0-hero/Matter-0.1-7B-boost is a 7 billion parameter language model developed by 0-hero, fine-tuned from a Mistral 7B base. It is trained on the Matter dataset, a curated collection drawn from over 35 source datasets totaling more than 6 billion tokens, with additional "boost" fine-tuning data on top. The model is optimized for general conversational tasks and notably supports function calling, making it suitable for applications that need to interact with external tools.


Matter-0.1-7B-boost Overview

0-hero/Matter-0.1-7B-boost builds on the Mistral 7B architecture and has undergone a full fine-tune on the Matter dataset, a curated collection derived from over 35 distinct datasets and comprising more than 6 billion tokens. This "boost" version includes additional fine-tuning data to further enhance its capabilities.

Key Capabilities

  • Extensive Training Data: Fine-tuned on the Matter dataset, a large-scale, curated collection of over 6 billion tokens from diverse sources.
  • Function Calling Support: Designed to integrate with external tools and APIs through explicit function calling mechanisms. It utilizes special tokens (<|begin_func|>, <|end_func|>, <|begin_func_response|>, <|end_func_response|>) to manage function calls and responses within the conversation flow.
  • ChatML Format: Employs the ChatML prompt format, ensuring compatibility with common conversational AI frameworks.
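The ChatML format above can be sketched as a simple prompt-assembly helper. This is a minimal illustration: the `<|im_start|>`/`<|im_end|>` delimiters are standard ChatML, but the exact way function definitions are declared in the system turn (here, a JSON list) is an assumption, not documented behavior of this model.

```python
import json

def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a two-turn ChatML prompt ending at the assistant turn."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Hypothetical function declaration surfaced to the model via the system prompt.
weather_fn = {
    "name": "get_weather",
    "description": "Fetch current weather for a city",
    "parameters": {"city": {"type": "string"}},
}

system_msg = (
    "You are a helpful assistant with access to these functions:\n"
    + json.dumps([weather_fn])
)
prompt = build_chatml_prompt(system_msg, "What's the weather in Paris?")
```

The trailing `<|im_start|>assistant\n` leaves the prompt open for the model to complete, which is the usual ChatML generation convention.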

Good For

  • Conversational AI: General-purpose chat applications and assistant roles.
  • Tool-Augmented Applications: Use cases requiring the model to interact with external functions, such as fetching real-time data, executing commands, or integrating with other software services.
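The function-calling round trip can be sketched as follows: extract the call the model emits between `<|begin_func|>`/`<|end_func|>`, run the tool, and wrap the result in `<|begin_func_response|>`/`<|end_func_response|>` for the next turn. The special tokens come from the model card; the shape of the JSON payload (`{"name": ..., "arguments": ...}`) and the `get_weather` tool are assumptions for illustration.

```python
import json

def extract_function_call(text: str):
    """Return the parsed call payload, or None if no function was called."""
    start, end = "<|begin_func|>", "<|end_func|>"
    if start not in text or end not in text:
        return None
    payload = text.split(start, 1)[1].split(end, 1)[0].strip()
    return json.loads(payload)

def format_function_response(result: dict) -> str:
    """Wrap a tool result so it can be fed back into the conversation."""
    return f"<|begin_func_response|>{json.dumps(result)}<|end_func_response|>"

# Example model output containing a hypothetical get_weather call.
model_output = (
    'Let me check.<|begin_func|>'
    '{"name": "get_weather", "arguments": {"city": "Paris"}}'
    '<|end_func|>'
)
call = extract_function_call(model_output)
if call and call["name"] == "get_weather":
    tool_result = {"temp_c": 18, "conditions": "cloudy"}  # stand-in tool result
    follow_up = format_function_response(tool_result)
```

In a real loop, `follow_up` would be appended to the conversation and the model prompted again to produce its final natural-language answer.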