0-hero/Matter-0.1-7B
Text Generation · Model Size: 7B · Quant: FP8 · Context Length: 4k · Concurrency Cost: 1 · Published: Mar 20, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Matter-0.1-7B is a 7 billion parameter language model developed by 0-hero as a full fine-tune of Mistral 7B. It is optimized for function calling and general conversational tasks, and was trained on the Matter dataset, which comprises over 6 billion tokens drawn from 35+ datasets. The model is designed to integrate external tools and APIs into its responses, making it suitable for applications that require dynamic interaction and information retrieval.
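A typical function-calling workflow with a model like this is to parse a structured tool call out of the model's raw output and dispatch it to a registered function. The sketch below illustrates that pattern; the `<tool_call>` delimiters and JSON payload shape are assumptions for illustration, since the exact format Matter-0.1-7B emits depends on its chat template.

```python
import json
import re

# Assumed tool-call delimiters; the actual tags Matter-0.1-7B emits
# may differ depending on its chat template.
TOOL_CALL_RE = re.compile(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", re.DOTALL)

def extract_tool_calls(model_output: str) -> list:
    """Parse JSON tool-call payloads out of raw model output."""
    return [json.loads(m) for m in TOOL_CALL_RE.findall(model_output)]

def dispatch(call: dict, registry: dict):
    """Look up the named function and invoke it with the model's arguments."""
    fn = registry[call["name"]]
    return fn(**call.get("arguments", {}))

# Example: a toy weather tool the model might be asked to call.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

registry = {"get_weather": get_weather}
output = '<tool_call>{"name": "get_weather", "arguments": {"city": "Paris"}}</tool_call>'
calls = extract_tool_calls(output)
print(dispatch(calls[0], registry))  # -> Sunny in Paris
```

In a real application, the dispatched result would be appended back into the conversation (for example as a tool-role message) so the model can incorporate it into its final answer.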
