AbacusResearch/haLLAwa2
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 12, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

AbacusResearch/haLLAwa2 is a 7-billion-parameter language model created by merging OpenPipe/mistral-ft-optimized-1227 and machinists/Mistral-7B-SQL, with a 4096-token context length. The merge is aimed at tasks that combine general reasoning with SQL generation, drawing on the strengths of both parent models. It achieves an average score of 64.44 on the Open LLM Leaderboard, demonstrating proficiency across benchmarks including MMLU and HellaSwag.
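The card does not document a prompt template, so the sketch below assumes the Mistral instruct-style `[INST] … [/INST]` convention inherited from the parent models; the `build_sql_prompt` helper and the example schema are illustrative, not part of the model card. The model-loading lines are left as comments because running them downloads the full 7B weights.

```python
def build_sql_prompt(question: str, schema: str) -> str:
    """Wrap a natural-language question and a table schema in a
    Mistral-style [INST] block (assumed format, not documented
    by the model card)."""
    return (
        "[INST] Given the schema:\n"
        f"{schema}\n"
        f"Write a SQL query to answer: {question} [/INST]"
    )


prompt = build_sql_prompt(
    "How many orders were placed in 2023?",
    "orders(id INT, placed_at DATE, total DECIMAL)",
)

# Usage with Hugging Face transformers (commented out: downloads ~14 GB
# of weights and requires a GPU for practical inference):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("AbacusResearch/haLLAwa2")
# model = AutoModelForCausalLM.from_pretrained(
#     "AbacusResearch/haLLAwa2", device_map="auto"
# )
# inputs = tok(prompt, return_tensors="pt").to(model.device)
# out = model.generate(**inputs, max_new_tokens=128)
# print(tok.decode(out[0][inputs["input_ids"].shape[-1]:],
#                  skip_special_tokens=True))
```

Keeping prompts well under the 4k-token context limit leaves room for the generated SQL in the completion.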
