FinancialSupport/saiga-7b

Hugging Face
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Dec 28, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights

FinancialSupport/saiga-7b is a 7 billion parameter language model developed by FinancialSupport, inspired by other open-source Italian models. It is designed for general language tasks, with a focus on performance within a limited resource environment. The model demonstrates competitive average performance on the Open LLM Leaderboard, including strong results on HellaSwag and Winogrande benchmarks.


FinancialSupport/saiga-7b Overview

FinancialSupport/saiga-7b is a 7 billion parameter language model developed by FinancialSupport, drawing inspiration from other open-source Italian models such as fauno/camoscio and cerbero. It was built with limited resources, largely during weekends, demonstrating that a competitive model can be produced on a modest budget.

Key Capabilities & Performance

The model's performance on the Open LLM Leaderboard indicates its general utility across various language understanding and reasoning tasks. Key benchmark results include:

  • Average Score: 64.51
  • HellaSwag (10-Shot): 83.14
  • Winogrande (5-Shot): 79.01
  • AI2 Reasoning Challenge (25-Shot): 63.14
  • MMLU (5-Shot): 61.66
  • TruthfulQA (0-Shot): 54.99
  • GSM8k (5-Shot): 45.11

These scores suggest a balanced capability in common sense reasoning, multiple-choice question answering, and general knowledge tasks.
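As a quick sanity check, the reported leaderboard average is the plain mean of the six individual benchmark scores. A minimal sketch, using the values listed above:

```python
# Recompute the Open LLM Leaderboard average from the six reported scores.
scores = {
    "AI2 Reasoning Challenge (25-Shot)": 63.14,
    "HellaSwag (10-Shot)": 83.14,
    "MMLU (5-Shot)": 61.66,
    "TruthfulQA (0-Shot)": 54.99,
    "Winogrande (5-Shot)": 79.01,
    "GSM8k (5-Shot)": 45.11,
}

average = round(sum(scores.values()) / len(scores), 2)
print(average)  # → 64.51, matching the reported Average Score
```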

When to Use This Model

FinancialSupport/saiga-7b is suitable for applications that need a 7B parameter model for general language understanding and generation, and it is a useful reference point for anyone interested in what can be achieved under tight resource constraints. Its balanced benchmark profile makes it a viable option for tasks where a solid baseline across reasoning, knowledge, and question-answering benchmarks is desired.
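For local experimentation, the model can be loaded with the Hugging Face `transformers` library. This is a minimal sketch, assuming the checkpoint is published in the standard `transformers` format under the repo id shown on this page; the prompt and generation settings are illustrative only:

```python
# Minimal usage sketch for FinancialSupport/saiga-7b via Hugging Face transformers.
# Assumes the repo hosts a standard causal-LM checkpoint; running this downloads
# the full 7B weights, so a GPU (or patience) is advisable.
MODEL_ID = "FinancialSupport/saiga-7b"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    # Imported lazily so the module can be inspected without fetching weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Italian prompt, reflecting the model's Italian-language lineage.
    print(generate("Qual è la capitale d'Italia?"))
```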