0-hero/Matter-0.2-7B
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Apr 2, 2024 | License: apache-2.0 | Architecture: Transformer | Open Weights | Cold
Matter-0.2-7B is a 7-billion-parameter language model developed by 0-hero, fine-tuned from Mistral 7B v0.2. It is trained on the Matter 0.2 dataset, curated from over 35 datasets after analyzing more than 6 billion tokens. The model is designed for general helpful-assistant tasks and notably supports function calling, enabling it to interact with external tools. Its primary strength is integrating tool use into conversational flows.
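A minimal sketch of loading the model with Hugging Face Transformers and prompting it for a tool call. The tool schema in the system prompt and the ChatML-style chat template are assumptions based on common conventions, not the model's documented format; consult the model card for the exact function-calling syntax.

```python
# Sketch: load Matter-0.2-7B and prompt it with a (hypothetical) tool definition.
# The system-prompt/tool format below is an assumed convention, not the
# model's documented template -- verify against the model card before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "0-hero/Matter-0.2-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "system", "content": (
        "You are a helpful assistant with access to tools. "
        'Available tool: {"name": "get_weather", "parameters": {"city": "string"}}'
    )},
    {"role": "user", "content": "What is the weather in Paris right now?"},
]

# Assumes the tokenizer ships a chat template (ChatML-style for this model).
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
# Keep special tokens visible so any tool-call markers appear in the output.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=False))
```

In a real tool-use loop, the application would parse the emitted tool call, execute the tool, and feed the result back as another message before generating the final answer.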