0-hero/Matter-0.1-7B-boost
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 19, 2024 · License: apache-2.0 · Architecture: Transformer
0-hero/Matter-0.1-7B-boost is a 7-billion-parameter language model developed by 0-hero and fine-tuned from a Mistral 7B base. It is trained on the Matter dataset, curated from over 35 source datasets and comprising more than 6 billion tokens, with an additional "boost" fine-tuning stage. The model is optimized for general conversational tasks and notably supports function calling, making it suitable for applications that need to interact with external tools.
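To illustrate how the function-calling support might be exercised, the sketch below builds a ChatML-style prompt that advertises a tool to the model. This is a minimal sketch under assumptions: Mistral fine-tunes commonly use ChatML role markers (`<|im_start|>` / `<|im_end|>`), but the exact chat template and tool-call tokens expected by this model are assumptions, and `get_weather` is a hypothetical tool invented for the example.

```python
import json

def build_prompt(messages):
    """Render a list of {role, content} dicts into a ChatML-style prompt.

    ChatML formatting here is an assumption; check the model's actual
    chat template (e.g. in its tokenizer_config) before relying on it.
    """
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Leave an open assistant turn to cue the model to respond.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

# Hypothetical tool definition the model could be asked to call.
weather_tool = {
    "name": "get_weather",
    "description": "Look up the current weather for a city",
    "parameters": {"city": {"type": "string"}},
}

messages = [
    {"role": "system",
     "content": "You may call tools. Available tools:\n"
                + json.dumps([weather_tool])},
    {"role": "user", "content": "What's the weather in Paris?"},
]

prompt = build_prompt(messages)
print(prompt)
```

The resulting string would be passed to the model (e.g. via a `transformers` text-generation pipeline), and the application would parse any tool-call the model emits, execute the tool, and append the result as a new message before generating again.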