Overview
The uukuguy/speechless-codellama-platypus-13b is a 13-billion-parameter language model derived from Meta's CodeLlama-13b-hf. It was fine-tuned by uukuguy on the Open-Platypus dataset, strengthening its capabilities as a "Tool LLM" for programming-related tasks. The model accepts prompts in the Alpaca instruction format, making it straightforward to use for a range of code generation and code understanding applications.
Key Capabilities
- Code Synthesis and Understanding: Designed for general code generation and comprehension.
- Instruction Following: Utilizes the Alpaca instruction format for structured prompting.
- Code Completion: Supports completing code snippets.
- Infilling: Capable of filling in missing parts of code.
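Since the model expects Alpaca-format prompts, a small helper that fills the standard Alpaca instruction template can be useful. This is a minimal sketch: the template text below is the commonly used Alpaca instruction template, not one quoted from this model card, so verify it against the card before relying on it. The `build_alpaca_prompt` helper name is chosen here for illustration.

```python
# Standard Alpaca single-turn instruction template (assumed; confirm
# against the model card's example prompt before use).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

def build_alpaca_prompt(instruction: str) -> str:
    """Fill the Alpaca template with a single instruction string."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

prompt = build_alpaca_prompt("Write a Python function that reverses a string.")
print(prompt)
```

The resulting string can then be passed to the model for generation, for example by loading `uukuguy/speechless-codellama-platypus-13b` with the Hugging Face transformers library and calling its usual text-generation API.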
Performance Metrics
Evaluations on the Open LLM Leaderboard show the model achieving an average score of 40.81. Specific metric scores include:
- ARC (25-shot): 46.16
- HellaSwag (10-shot): 68.88
- MMLU (5-shot): 44.55
- TruthfulQA (0-shot): 44.98
Intended Use Cases
This model is intended for commercial and research use in English and relevant programming languages. It can be adapted to a variety of code synthesis and understanding tasks and can serve as a programming assistant. As with any LLM, its outputs may be inaccurate or objectionable, so developers should perform safety testing and tuning for their specific applications before deployment.