0-hero/Matter-0.1-Slim-7B-C
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 15, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

0-hero/Matter-0.1-Slim-7B-C is a 7-billion-parameter language model, a continued full fine-tune of Mistral 7B. It was trained on the slim-C version of the Matter dataset, a curation drawn from more than 35 source datasets covering over 6 billion tokens. The model is designed for function calling, making it suitable for applications that need structured interactions with external tools and APIs. It uses the ChatML prompt format and has a context length of 4096 tokens.
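
As a minimal sketch of using the ChatML format in practice, the example below loads the model with Hugging Face transformers and renders a conversation through the tokenizer's chat template. It assumes the repository ships a ChatML chat template with the tokenizer; the message contents, sampling settings, and any function-calling special tokens are illustrative, not taken from the model card.

```python
# Sketch: ChatML-style inference, assuming the repo's tokenizer
# includes a chat template (not confirmed by the card itself).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "0-hero/Matter-0.1-Slim-7B-C"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize what function calling is in one sentence."},
]

# apply_chat_template wraps each turn in ChatML markers
# (<|im_start|>role ... <|im_end|>) per the template the repo provides.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Keep prompt + generation inside the 4096-token context window.
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

For tool use, the same chat flow applies, with tool schemas and tool results passed as additional messages in whatever convention the model's template defines.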
