AbacusResearch/haLLawa4-7b
Task: Text Generation · Model Size: 7B · Quantization: FP8 · Context Length: 4k · Concurrency Cost: 1 · Published: Feb 19, 2024 · License: apache-2.0 · Architecture: Transformer

AbacusResearch/haLLawa4-7b is a 7-billion-parameter language model created by AbacusResearch by merging mlabonne/Monarch-7B, paulml/OGNO-7B, and AbacusResearch/haLLAwa3 with the DARE TIES method. It demonstrates strong general reasoning, achieving an average score of 75.25 on the Open LLM Leaderboard, with notable performance on commonsense-reasoning and mathematical tasks. It is intended for applications that require robust language understanding and generation.
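As a minimal sketch of how the model might be used, the snippet below loads it with the Hugging Face transformers library. It assumes the weights are hosted on the Hub under the repository id shown above; the prompt and generation settings are illustrative assumptions, not values from the model card.

```python
# Minimal sketch: load AbacusResearch/haLLawa4-7b with Hugging Face transformers.
# Assumes the weights are available on the Hub under this id; generation
# settings (max_new_tokens, temperature) are illustrative, not recommended.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AbacusResearch/haLLawa4-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain the DARE TIES merging method in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=128, temperature=0.7, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```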
