Model Overview
Josephgflowers/Cinder-Phi-2-V1-F16-gguf is a 3-billion-parameter AI chatbot built on the Phi-2 architecture. It is tailored for scientific and educational conversation, companionship, and imaginative exploration.
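Because the repository ships GGUF weights, one common way to try the model locally is through llama-cpp-python. The sketch below is illustrative only: the exact .gguf filename, context size, prompt format (ChatML-style is assumed), and sampling settings are assumptions, not confirmed repository settings.

```python
# Minimal local-chat sketch using llama-cpp-python (pip install llama-cpp-python).
# The filename, n_ctx, and chat_format below are assumptions; adjust to the actual files.
from llama_cpp import Llama

llm = Llama(
    model_path="Cinder-Phi-2-V1-F16.gguf",  # hypothetical filename from this repo
    n_ctx=2048,                             # Phi-2's context window
    chat_format="chatml",                   # assumed; swap in the model's actual template
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are Cinder, a friendly science and education companion."},
        {"role": "user", "content": "Why does the sky look blue?"},
    ],
    max_tokens=256,
    temperature=0.7,
)
print(response["choices"][0]["message"]["content"])
```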
Key Capabilities & Training
This version of Cinder has been fine-tuned with a diverse dataset:
- Core Training Data: Similar to OpenHermes 2.5.
- Enhanced Reasoning: Supplemented with additional math, STEM, and reasoning data, primarily sourced from OpenOrca.
- Character-Specific Data: A mix of RAG-generated Q&A covering world knowledge, STEM topics, and Cinder character traits, plus an abbreviated, edited Samantha dataset (with negative responses largely removed) to further define the Cinder character.
Performance Benchmarks
Evaluations on the Open LLM Leaderboard show the following average scores:
- Overall Average: 58.86
- AI2 Reasoning Challenge (25-shot): 58.28
- HellaSwag (10-shot): 74.04
- MMLU (5-shot): 54.46
- TruthfulQA (0-shot): 44.50
- Winogrande (5-shot): 74.66
- GSM8K (5-shot): 47.23
On a separate evaluation suite, the model averages 10.86, including IFEval (23.57) and BBH (22.45).
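For readers who want to reproduce Open LLM Leaderboard-style numbers, a rough sketch with EleutherAI's lm-evaluation-harness (v0.4+) follows. The model id is a placeholder (the leaderboard evaluates Transformers-format weights, not the GGUF files in this repo), and the exact task names and few-shot counts are assumptions based on the benchmarks listed above.

```python
# Sketch of leaderboard-style evaluation with lm-evaluation-harness (pip install lm-eval).
# Model id, task names, few-shot counts, and batch size are assumptions, not official settings.
import lm_eval

MODEL_ID = "path/or/hub-id-of-the-transformers-weights"  # placeholder, not the GGUF repo

TASK_SHOTS = {
    "arc_challenge": 25,   # AI2 Reasoning Challenge
    "hellaswag": 10,
    "mmlu": 5,
    "truthfulqa_mc2": 0,
    "winogrande": 5,
    "gsm8k": 5,
}

for task, shots in TASK_SHOTS.items():
    out = lm_eval.simple_evaluate(
        model="hf",
        model_args=f"pretrained={MODEL_ID},dtype=float16",
        tasks=[task],
        num_fewshot=shots,
        batch_size=8,
    )
    print(task, out["results"][task])
```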
Good For
- Educational Applications: Ideal for scenarios requiring scientific and educational dialogue.
- Interactive Companionship: Suitable for applications where an AI companion is desired.
- Creative Exploration: Can be used for tasks that benefit from imaginative and exploratory interactions.