0-hero/Matter-0.1-Slim-7B-A
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Mar 13, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
Matter-0.1-Slim-7B-A is a 7-billion-parameter language model developed by 0-hero, fine-tuned from Mistral 7B. It is trained on the Matter-0.1-Slim-A dataset, curated from over 35 source datasets spanning more than 6 billion tokens. The model includes integrated function-calling capabilities, making it suitable for applications that require structured interactions and tool use.
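A typical function-calling workflow with a model like this involves passing JSON tool schemas in the prompt and parsing a structured JSON tool invocation out of the model's reply. The sketch below is illustrative only: the prompt format, helper names, and example tool are assumptions for demonstration, not the model's documented chat template.

```python
import json

def build_tool_prompt(tools, user_message):
    """Embed JSON tool schemas in a system prompt for a
    function-calling model (illustrative format, not the
    model's official template)."""
    schema_block = json.dumps(tools, indent=2)
    system = (
        "You are a function-calling assistant. Available tools:\n"
        + schema_block
        + '\nReply with a JSON object {"name": ..., "arguments": ...} '
        "to invoke a tool."
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user_message},
    ]

def parse_tool_call(model_output):
    """Extract a tool call from the model's raw text output,
    returning None if the text is not a valid tool invocation."""
    try:
        call = json.loads(model_output)
        if isinstance(call, dict) and "name" in call and "arguments" in call:
            return call
    except json.JSONDecodeError:
        pass
    return None

# Hypothetical tool schema for demonstration.
tools = [{
    "name": "get_weather",
    "description": "Look up current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}]

messages = build_tool_prompt(tools, "What's the weather in Oslo?")

# A model reply of this shape would parse back into a Python dict:
call = parse_tool_call('{"name": "get_weather", "arguments": {"city": "Oslo"}}')
```

The messages list would then be fed through the model's chat template (e.g. via `tokenizer.apply_chat_template` in the `transformers` library), and the parsed call dispatched to the matching local function.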