andrijdavid/Macaroni-7b-Tied
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Jan 19, 2024 · License: apache-2.0 · Architecture: Transformer
Macaroni-7b-Tied is a 7-billion-parameter language model by andrijdavid, built on the Mistral-7B-v0.1 base using the TIES merge method. It combines four distinct 7B models into a single checkpoint, aiming for balanced performance across benchmarks. The model scores an average of 74.96 on the Open LLM Leaderboard, covering reasoning, common-sense, and language-understanding tasks.
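The TIES procedure mentioned above merges several fine-tuned models into one base by working on "task vectors" (each model's weight delta from the base) in three steps: trim small-magnitude deltas, elect a majority sign per parameter, and average only the deltas that agree with that sign. The model card does not include the merge configuration, so the following is a minimal NumPy sketch of the general TIES algorithm, not the exact recipe used for this model; `density` and `lam` are illustrative hyperparameters.

```python
import numpy as np

def ties_merge(base, finetuned, density=0.2, lam=1.0):
    """Sketch of TIES merging: trim, elect sign, disjoint merge.

    base      -- flat array of base-model weights
    finetuned -- list of flat arrays, one per fine-tuned model
    density   -- fraction of each task vector kept after trimming
    lam       -- scaling applied to the merged task vector
    """
    # Task vectors: each fine-tuned model's delta from the base.
    deltas = [ft - base for ft in finetuned]

    # 1) Trim: keep only the top-`density` fraction of each delta by magnitude.
    trimmed = []
    for d in deltas:
        k = max(1, int(round(density * d.size)))
        thresh = np.sort(np.abs(d))[-k]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))

    # 2) Elect sign: per-parameter sign of the summed trimmed deltas.
    sign = np.sign(sum(trimmed))
    sign[sign == 0] = 1.0  # break ties toward positive

    # 3) Disjoint merge: average only deltas agreeing with the elected sign.
    stacked = np.stack(trimmed)
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged = (stacked * agree).sum(axis=0) / counts

    return base + lam * merged
```

In practice merges like this one are typically produced with a tool such as mergekit rather than hand-rolled code; the sketch only illustrates why TIES tends to preserve each donor model's strongest parameter updates while discarding conflicting ones.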