Undi95/CodeEngine: A 13B Parameter Model for General and Code-Related Tasks
Undi95/CodeEngine is a 13 billion parameter language model built upon the jondurbin/airoboros-l2-13b-2.1 base model, enhanced with a LoRA adapter specifically for code-related functionalities. This model is designed to handle a variety of language understanding and generation tasks, leveraging its 4096 token context window.
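The 4096-token window is a shared budget: prompt tokens plus generated tokens must fit inside it. A minimal sketch of that bookkeeping, using whitespace splitting as a stand-in for the model's real tokenizer (in practice you would count tokens with the tokenizer shipped alongside the model):

```python
CONTEXT_LEN = 4096  # CodeEngine's context window


def fit_to_context(prompt: str, max_new_tokens: int,
                   context_len: int = CONTEXT_LEN) -> str:
    """Drop the oldest part of the prompt so prompt + generation fits the window.

    Whitespace splitting is a stand-in for real tokenization; actual
    token counts come from the model's own tokenizer.
    """
    budget = context_len - max_new_tokens
    if budget <= 0:
        raise ValueError("max_new_tokens exceeds the context window")
    tokens = prompt.split()
    if len(tokens) <= budget:
        return prompt
    # Keep the most recent tokens; earlier context is discarded.
    return " ".join(tokens[-budget:])


long_prompt = " ".join(f"tok{i}" for i in range(5000))
trimmed = fit_to_context(long_prompt, max_new_tokens=512)
print(len(trimmed.split()))  # 4096 - 512 = 3584
```

The same arithmetic applies however the prompt is built: reserving room for `max_new_tokens` up front avoids the model silently losing the start of the context.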
Key Capabilities & Performance
Evaluated on the Open LLM Leaderboard, CodeEngine posts moderate but balanced scores across several benchmarks:
- Average Score: 50.96
- ARC (25-shot): 58.36
- HellaSwag (10-shot): 82.27
- MMLU (5-shot): 54.18
- TruthfulQA (0-shot): 45.18
- Winogrande (5-shot): 74.59
- GSM8K (5-shot): 1.52
- DROP (3-shot): 40.59
These scores indicate solid common-sense reasoning (HellaSwag, Winogrande), reasonable reading comprehension (DROP), and fair general knowledge (MMLU), while the near-zero GSM8K result shows a clear weakness in multi-step mathematical reasoning.
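The n-shot settings above mean n worked examples are prepended to each test question before the model answers. The exact templates used by the leaderboard harness differ per task; the format below is illustrative only:

```python
def build_few_shot_prompt(examples, question, n_shot=5):
    """Prepend n_shot solved (question, answer) pairs before the real question.

    This generic Question/Answer template is an illustration, not the
    leaderboard's actual per-task format.
    """
    shots = examples[:n_shot]
    parts = [f"Question: {q}\nAnswer: {a}" for q, a in shots]
    parts.append(f"Question: {question}\nAnswer:")
    return "\n\n".join(parts)


demo = [(f"What is {i} + {i}?", str(2 * i)) for i in range(1, 8)]
prompt = build_few_shot_prompt(demo, "What is 9 + 9?", n_shot=5)
print(prompt.count("Question:"))  # 5 shots + 1 query = 6
```

Zero-shot tasks such as TruthfulQA skip the examples entirely and present only the final question.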
Good For
- General-purpose text generation and understanding: Suitable for a wide range of NLP tasks.
- Code-related applications: Benefits from its LoRA fine-tuning for code.
- Research and experimentation: Provides a solid base for further fine-tuning or task-specific adaptations.
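Since CodeEngine inherits the airoboros-l2 base, prompts presumably follow that family's chat layout; the USER/ASSISTANT template below is an assumption carried over from the base model, not something documented for CodeEngine itself. A sketch of formatting a code request:

```python
# Assumed airoboros-2.1-style system line; verify against the base model card.
SYSTEM = "A chat."


def format_prompt(user_message: str, system: str = SYSTEM) -> str:
    """Wrap a request in the assumed USER/ASSISTANT chat layout."""
    return f"{system} USER: {user_message} ASSISTANT: "


prompt = format_prompt("Write a Python function that reverses a string.")
print(prompt)
```

The resulting string would then be passed as the input to whatever serving stack hosts the model; sampling settings such as temperature and top-p are chosen there.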