AbacusResearch/jaLLAbi2-7b
Text generation
- Concurrency cost: 1
- Model size: 7B
- Quantization: FP8
- Context length: 4k
- Published: Feb 20, 2024
- License: apache-2.0
- Architecture: Transformer
- Tags: Open Weights, Cold
AbacusResearch/jaLLAbi2-7b is a 7-billion-parameter language model from AbacusResearch, created by merging several existing 7B models with mergekit. It achieves an average score of 75.06 on the Open LLM Leaderboard, indicating strong general reasoning and language understanding across benchmarks, and is well suited to tasks such as common-sense reasoning, question answering, and mathematical problem solving.
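The card states the model was assembled with mergekit but does not document the source models or merge method. As a purely illustrative sketch, a mergekit SLERP merge of two hypothetical 7B checkpoints (`org/model-a-7b` and `org/model-b-7b` are placeholders, not the actual ingredients) might look like:

```yaml
# Hypothetical mergekit config — the real source models and method
# behind jaLLAbi2-7b are not documented on this card.
slices:
  - sources:
      - model: org/model-a-7b        # placeholder model name
        layer_range: [0, 32]
      - model: org/model-b-7b        # placeholder model name
        layer_range: [0, 32]
merge_method: slerp                   # spherical interpolation of weights
base_model: org/model-a-7b
parameters:
  t: 0.5                              # interpolation factor between the two models
dtype: bfloat16
```

A config like this is typically run with `mergekit-yaml config.yml ./merged-model`, producing a single set of merged weights.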