sethuiyer/Qwen2.5-7B-Anvita
Text Generation | Model Size: 7.6B | Quant: FP8 | Context Length: 32k | License: apache-2.0 | Architecture: Transformer | Open Weights

sethuiyer/Qwen2.5-7B-Anvita is a 7.6-billion-parameter language model based on the Qwen2.5 architecture. It scores an average of 29.18 across its reported benchmarks, which include IFEval, BBH, and MMLU-PRO, and is designed for general language understanding and generation tasks. Its scores on IFEval (64.8) and MMLU-PRO (35.17) reflect its instruction-following and multi-task language-understanding capabilities.
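The model can be loaded with the standard Hugging Face `transformers` API. This is a minimal sketch, assuming the `transformers` and `torch` packages are installed; the prompt text is illustrative, and the full weights are downloaded from the Hub on first use.

```python
# Minimal sketch: load sethuiyer/Qwen2.5-7B-Anvita and run one chat turn.
# Assumes `transformers` and `torch` are installed; weights download on first use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sethuiyer/Qwen2.5-7B-Anvita"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the checkpoint's native precision
    device_map="auto",    # place layers on available GPU(s)/CPU
)

# Qwen2.5 models ship a chat template in the tokenizer config.
messages = [{"role": "user", "content": "Explain FP8 quantization in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

For production serving at the listed 32k context length, an inference engine with FP8 support would typically be used instead of a raw `generate` loop.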
