Weyaxi/TekniumAiroboros-Nebula-7B

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8K · Published: Oct 8, 2023 · Architecture: Transformer · Status: Cold

Weyaxi/TekniumAiroboros-Nebula-7B is a 7-billion-parameter language model published by Weyaxi, featuring an 8192-token context length. It has been evaluated on the Open LLM Leaderboard, where it averages 52.82 across standard benchmarks, and is suited to tasks requiring broad knowledge and reasoning, as indicated by its scores on ARC, MMLU, and HellaSwag.


Model Overview

Weyaxi/TekniumAiroboros-Nebula-7B is a 7-billion-parameter language model with an 8192-token context window, developed by Weyaxi. It has been evaluated on the Hugging Face Open LLM Leaderboard; the scores below summarize its performance across a range of academic benchmarks.

Key Capabilities

  • General Language Understanding: Achieves an average score of 52.82 on the Open LLM Leaderboard, indicating broad comprehension abilities.
  • Reasoning: Scores 57.17 on ARC (25-shot) and 55.25 on MMLU (5-shot).
  • Common Sense: Scores 81.72 on HellaSwag (10-shot) and 73.24 on Winogrande (5-shot), reflecting common sense reasoning.
  • Question Answering: Scores 51.64 on TruthfulQA (0-shot) and 41.33 on DROP (3-shot).

Good For

  • Applications requiring a general-purpose language model with a moderate parameter count.
  • Tasks benefiting from a balance of reasoning, common sense, and factual recall.
  • Developers looking for a 7B model with documented performance on standard benchmarks for initial evaluation and fine-tuning (see the loading sketch below).
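
For local evaluation or as a fine-tuning starting point, the model can be loaded through the Hugging Face transformers library. The following is a minimal sketch, not an official recipe: the repository ID matches the model name above, but the dtype, device placement, and generation settings are assumptions to adjust for your hardware.

```python
# Minimal sketch: load Weyaxi/TekniumAiroboros-Nebula-7B with transformers.
# dtype/device choices below are assumptions, not documented requirements.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Weyaxi/TekniumAiroboros-Nebula-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: fp16 fits on a single modern GPU
    device_map="auto",          # requires the accelerate package
)

prompt = "Explain the difference between ARC and MMLU in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```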

Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model. Each config specifies the following sampler parameters: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
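
To apply a sampler config of this shape in a request, a sketch along these lines could work, assuming an OpenAI-compatible endpoint. The base URL and every parameter value below are placeholders for illustration, not the actual top-3 configs from this page.

```python
# Minimal sketch: send sampler settings to an OpenAI-compatible endpoint.
# Base URL and all parameter values are assumptions, shown for illustration.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",  # assumption: OpenAI-compatible API
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="Weyaxi/TekniumAiroboros-Nebula-7B",
    messages=[{"role": "user", "content": "Write a haiku about autumn."}],
    temperature=0.7,             # standard OpenAI parameters
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    extra_body={                 # non-standard samplers pass through extra_body
        "top_k": 40,
        "repetition_penalty": 1.1,
        "min_p": 0.05,
    },
)
print(response.choices[0].message.content)
```

Parameters like top_k, repetition_penalty, and min_p are not part of the standard OpenAI schema, so they are passed via extra_body; whether the server honors them depends on the backend.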