Model Overview
luffycodes/llama-shishya-7b-ep3-v2 is a 7-billion-parameter language model based on the Llama architecture. It serves as the "student model" in the CLASS framework for educational tutoring chatbots, a framework that integrates learning science principles to support effective tutoring interactions.
Key Characteristics
- Architecture: Llama-based, 7 billion parameters.
- Context Length: Supports a context window of 4096 tokens.
- Framework: Developed using the CLASS framework, as detailed in the research paper "CLASS Meet SPOCK: An Education Tutoring Chatbot based on Learning Science Principles."
Primary Use Case
- Educational Tutoring: This model is intended for education-focused chatbot applications, where it acts as a student model that simulates a learner's responses in tutoring scenarios grounded in learning science principles.
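A minimal sketch of loading the checkpoint with Hugging Face transformers is shown below. The prompt template (`build_prompt`) and the turn labels it uses are illustrative assumptions, not the conversation format from the CLASS paper; consult the paper for the exact training format.

```python
MODEL_ID = "luffycodes/llama-shishya-7b-ep3-v2"


def build_prompt(tutor_turn: str) -> str:
    # Hypothetical tutor/student turn layout for the student-model role;
    # the actual format used during fine-tuning may differ.
    return f"Tutor: {tutor_turn}\nStudent:"


def generate_student_reply(tutor_turn: str, max_new_tokens: int = 128) -> str:
    # Lazy import so the prompt helper above works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

    inputs = tokenizer(build_prompt(tutor_turn), return_tensors="pt")
    # Keep prompt plus generation within the model's 4096-token context window.
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


# The generation call itself downloads ~7B parameters, so only the
# lightweight prompt construction is demonstrated here:
print(build_prompt("Can you explain Newton's second law?"))
```

In practice you would call `generate_student_reply("...")` on a machine with enough memory for a 7B model (or pass quantization options to `from_pretrained`).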
Citation
If you use this model in your work, please cite the associated research paper:
@misc{sonkar2023class,
  title={CLASS Meet SPOCK: An Education Tutoring Chatbot based on Learning Science Principles},
  author={Shashank Sonkar and Lucy Liu and Debshila Basu Mallick and Richard G. Baraniuk},
  year={2023},
  eprint={2305.13272},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}