dfurman/GarrulusMarcoro-7B-v0.1
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Jan 11, 2024 | License: apache-2.0 | Architecture: Transformer | Open Weights
dfurman/GarrulusMarcoro-7B-v0.1 is a 7 billion parameter language model created by dfurman by merging udkai/Garrulus and mlabonne/NeuralMarcoro14-7B. The model supports a 4096-token context length and demonstrates strong general reasoning, achieving an average score of 74.20 on the Open LLM Leaderboard. It is well suited to text-generation tasks that require robust understanding across a range of benchmarks.