Model Overview
This model, luffycodes/vicuna-mmlu-val-only-correct-mcq-7b-ep2, is a 7 billion parameter language model built on the Vicuna architecture. It was developed for educational applications, specifically as a component of a tutoring chatbot designed around learning science principles.
Key Characteristics
- Architecture: Based on the Vicuna 7B model.
- Parameter Count: 7 billion parameters.
- Context Length: Supports a context window of 4096 tokens.
- Specialization: Fine-tuned for tasks involving multiple-choice questions, particularly within an educational context.
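Given these characteristics, the model can be loaded like any other Vicuna-style checkpoint from the Hugging Face Hub. A minimal sketch using the `transformers` library (the loading arguments here, such as `device_map="auto"`, are illustrative assumptions, not settings documented by the model authors):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "luffycodes/vicuna-mmlu-val-only-correct-mcq-7b-ep2"
MAX_CONTEXT = 4096  # context window noted above


def load_model():
    # A 7B model typically needs roughly 14 GB of memory in fp16;
    # device_map="auto" (requires the accelerate package) spreads
    # layers across whatever devices are available.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_model()
```

Because this is a full fine-tune rather than an adapter, no extra merging step is needed; the checkpoint loads directly.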
Intended Use Cases
This model is particularly suited for:
- Educational Tutoring Systems: Designed to power chatbots that provide educational assistance following learning science principles.
- Multiple-Choice Question Answering: Fine-tuned to process and respond to multiple-choice questions, making it suitable for quizzes, assessments, and interactive learning tools.
- Research in AI in Education: Useful for researchers exploring the application of large language models in pedagogical contexts, as detailed in the associated paper, "CLASS Meet SPOCK: An Education Tutoring Chatbot based on Learning Science Principles" (arXiv:2305.13272).
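For the multiple-choice use case above, prompts are typically built in the Vicuna conversation style (`USER: ... ASSISTANT:`). The MCQ layout below is an illustrative assumption, not the exact format the model was fine-tuned on:

```python
def format_mcq(question, choices):
    """Build a Vicuna-style prompt for a multiple-choice question.

    The USER/ASSISTANT turn structure follows the common Vicuna
    convention; the lettered-choice layout is a hypothetical example.
    """
    letters = "ABCD"
    lines = [question]
    for letter, choice in zip(letters, choices):
        lines.append(f"({letter}) {choice}")
    body = "\n".join(lines)
    return (
        f"USER: {body}\n"
        "Answer with the letter of the correct choice.\n"
        "ASSISTANT:"
    )


prompt = format_mcq(
    "Which planet is closest to the Sun?",
    ["Venus", "Mercury", "Earth", "Mars"],
)
```

The resulting string would then be tokenized and passed to the model's `generate` method; parsing the returned letter out of the completion is left to the caller.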