0-hero/Matter-0.1-Slim-7B-preview
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 15, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

The 0-hero/Matter-0.1-Slim-7B-preview is a 7 billion parameter language model, a full fine-tune of Mistral 7B. It was trained on the slim-D version of the Matter dataset, curated from over 35 source datasets comprising more than 6 billion tokens. The model is designed with robust function calling capabilities, making it suitable for applications that require structured interaction with external tools and APIs. It uses the ChatML prompt format and has a context length of 4096 tokens.
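Since the model expects the ChatML prompt format, a minimal sketch of how a conversation would be serialized before being sent to the model is shown below. The helper name `format_chatml` and the example messages are illustrative, not part of the model card; ChatML itself wraps each turn in `<|im_start|>role` and `<|im_end|>` markers.

```python
def format_chatml(messages):
    """Serialize a list of {role, content} turns into a ChatML prompt string."""
    parts = []
    for msg in messages:
        # Each turn is delimited by <|im_start|>{role} ... <|im_end|>
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
    # A trailing assistant header cues the model to generate its reply
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)


prompt = format_chatml([
    {"role": "system", "content": "You are a helpful assistant with tool access."},
    {"role": "user", "content": "What is the weather in Paris?"},
])
print(prompt)
```

In practice, tokenizer libraries that ship a chat template for this model would handle this serialization automatically; the sketch only illustrates the wire format.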
