senseable/garten2-7b

Text Generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 11, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Garten2-7B by senseable is a 7-billion-parameter general-purpose large language model (LLM) designed for natural language understanding and generation. It delivers strong performance across a range of tasks, including conversation and content creation, as evidenced by its competitive scores on the Open LLM Leaderboard, and is intended as a broadly useful model for diverse AI applications.


Garten2-7B: A Versatile 7B Language Model

Garten2-7B, developed by senseable, is a 7-billion-parameter general-purpose large language model (LLM) engineered for natural language understanding and generation, with an architecture aimed at robust performance across a broad spectrum of tasks.

Key Capabilities & Performance

This model demonstrates strong capabilities in general language tasks, as indicated by its evaluation on the Open LLM Leaderboard. It achieves an average score of 72.65, with notable results including:

  • AI2 Reasoning Challenge (25-shot): 69.37
  • HellaSwag (10-shot): 87.54
  • MMLU (5-shot): 65.44
  • TruthfulQA (0-shot): 59.50
  • Winogrande (5-shot): 84.69
  • GSM8k (5-shot): 69.37
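Scores like these can in principle be reproduced locally with EleutherAI's lm-evaluation-harness. A sketch, assuming the repo id `senseable/garten2-7b` shown above and the harness's `lm_eval` CLI (v0.4 conventions); the exact Open LLM Leaderboard harness version and settings may differ, so small deviations in the numbers are expected:

```shell
# Sketch: re-running two of the benchmarks above with lm-evaluation-harness.
# Shot counts mirror the settings listed on this page.
pip install lm-eval

lm_eval --model hf \
  --model_args pretrained=senseable/garten2-7b \
  --tasks hellaswag --num_fewshot 10 \
  --batch_size auto

lm_eval --model hf \
  --model_args pretrained=senseable/garten2-7b \
  --tasks gsm8k --num_fewshot 5 \
  --batch_size auto
```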

Good For

  • General-purpose language tasks: Its balanced performance across various benchmarks suggests suitability for a wide range of applications.
  • Conversation and content creation: Specifically designed to excel in these areas.
  • Applications requiring a smaller, efficient LLM: at 7 billion parameters, it offers a good balance between capability and computational cost.
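As a standard causal language model with open weights, it can be loaded through Hugging Face Transformers. A minimal sketch, assuming the repo id `senseable/garten2-7b` shown at the top of this page and untuned, illustrative generation settings:

```python
def generate(prompt: str, model_id: str = "senseable/garten2-7b",
             max_new_tokens: int = 256) -> str:
    """Generate a completion from Garten2-7B via Hugging Face Transformers.

    The heavy imports are deferred so merely defining this helper is cheap;
    the first call downloads the model weights from the Hub.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Strip the prompt tokens so only the newly generated text is returned.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

For example, `generate("Draft a short product description for a garden trowel.")` would exercise the content-creation use case above; note that the first call triggers a multi-gigabyte weight download.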