0-hero/Matter-0.1-Slim-7B-B
Text Generation
Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 14, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

Matter-0.1-Slim-7B-B is a 7-billion-parameter language model developed by 0-hero, based on the Mistral 7B architecture. It is a full fine-tune on the Matter-0.1-Slim-B dataset, which is curated from over 35 datasets spanning more than 6 billion tokens. The model is specifically designed to support function calling, making it suitable for applications requiring structured interactions with external tools or APIs. It uses the ChatML prompt format for conversational tasks.
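As a minimal sketch of the ChatML format mentioned above, the helper below renders a list of role/content messages into a ChatML prompt string. The function name and message structure are illustrative assumptions; the `<|im_start|>`/`<|im_end|>` special tokens are the standard ChatML delimiters, but should be confirmed against this model's tokenizer configuration.

```python
def build_chatml_prompt(messages):
    """Render [{"role": ..., "content": ...}] messages in ChatML format.

    Illustrative helper, not part of the model's tooling.
    """
    parts = []
    for m in messages:
        # Each turn is wrapped in <|im_start|>{role} ... <|im_end|>.
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # A trailing assistant header cues the model to generate its reply.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the capital of France?"},
])
print(prompt)
```

The resulting string can be passed to any completion endpoint serving the model; chat-aware serving stacks typically apply this template automatically via the tokenizer's chat template.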
