andrijdavid/Macaroni-v2-7b
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Feb 5, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Macaroni-v2-7b by andrijdavid is a 7-billion-parameter language model created by merging flemmingmiguel/MBX-7B-v3, mlabonne/OmniBeagle-7B, and vanillaOVO/supermario_v4 with the DARE TIES method, using mistralai/Mistral-7B-v0.1 as the base. The merge combines the strengths of its constituent models into a versatile foundation for general natural language processing tasks. Its 4096-token context length supports moderate-length prompts and generations.
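DARE TIES merges like this are typically produced with the mergekit toolkit. The exact configuration used for Macaroni-v2-7b is not shown here; the sketch below is an illustrative mergekit config assuming the models and base named above, with the `density` and `weight` values chosen purely as placeholders.

```yaml
# Hypothetical mergekit configuration for a DARE TIES merge
# of the three source models onto the Mistral-7B-v0.1 base.
# density/weight values are illustrative, not the actual ones used.
models:
  - model: mistralai/Mistral-7B-v0.1
    # base model: no merge parameters needed
  - model: flemmingmiguel/MBX-7B-v3
    parameters:
      density: 0.5   # fraction of delta weights kept after DARE pruning
      weight: 0.3    # contribution of this model's task vector
  - model: mlabonne/OmniBeagle-7B
    parameters:
      density: 0.5
      weight: 0.3
  - model: vanillaOVO/supermario_v4
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

A config like this would be run with `mergekit-yaml config.yml ./output-dir`, producing a merged checkpoint in standard Hugging Face format.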