namirocks/vicuna-tutor-shishya-model-7b-ep3

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quantization: FP8 · Context Length: 4k · Published: Jan 27, 2024 · License: llama2 · Architecture: Transformer · Open Weights

The namirocks/vicuna-tutor-shishya-model-7b-ep3 is a 7-billion-parameter language model based on the Vicuna architecture, fine-tuned for educational tutoring applications. Developed by namirocks, the model is designed to function as a chatbot adhering to learning science principles, as detailed in the CLASS Meet SPOCK research. With a 4096-token context length, it is optimized for interactive educational dialogue, making it suitable for academic support and tutoring systems.


Overview

This model builds on the Vicuna-7B architecture and was developed to serve as an educational tutoring chatbot informed by established learning science principles. It is directly associated with the research presented in the paper "CLASS Meet SPOCK: An Education Tutoring Chatbot based on Learning Science Principles" by Sonkar et al. (2023).

Key Capabilities

  • Educational Tutoring: Specifically fine-tuned to engage in tutoring dialogues, aligning with pedagogical strategies.
  • Learning Science Integration: Designed with principles from learning science to enhance its effectiveness as an educational assistant.
  • Contextual Understanding: Supports a context length of 4096 tokens, enabling more coherent and extended tutoring sessions.
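The model card does not document an exact prompt template, but as a Vicuna fine-tune the model plausibly follows the standard Vicuna multi-turn conversation format. The sketch below shows how a tutoring dialogue might be assembled in that format and budgeted against the 4096-token window; the system prompt, the `USER:`/`ASSISTANT:` template, and the 4-characters-per-token estimate are all assumptions, not facts from the model card.

```python
# Sketch of a Vicuna-style multi-turn tutoring prompt builder.
# Assumptions (not confirmed by the model card): the model follows the
# common Vicuna template ("USER: ... ASSISTANT: ...") and a rough
# 4-characters-per-token heuristic is adequate for context budgeting.

CTX_TOKENS = 4096          # context length stated on the model card
CHARS_PER_TOKEN = 4        # crude estimate; use the real tokenizer in practice

# Hypothetical system prompt illustrating a learning-science framing.
SYSTEM = ("A chat between a student and a tutor. The tutor guides the "
          "student with hints and questions rather than direct answers.")

def build_prompt(turns):
    """turns: list of (user_msg, assistant_msg_or_None) pairs.

    A trailing None assistant message leaves the prompt open for the
    model to generate the next tutor reply.
    """
    parts = [SYSTEM]
    for user, assistant in turns:
        parts.append(f"USER: {user}")
        if assistant is not None:
            parts.append(f"ASSISTANT: {assistant}</s>")
        else:
            parts.append("ASSISTANT:")
    return "\n".join(parts)

def fits_context(prompt, reserve_tokens=512):
    """Check the prompt leaves room for a reply within the context window."""
    est_tokens = len(prompt) / CHARS_PER_TOKEN
    return est_tokens + reserve_tokens <= CTX_TOKENS

prompt = build_prompt([
    ("What is a derivative?",
     "Think of it as an instantaneous rate of change. What happens to "
     "average speed as the time interval shrinks?"),
    ("It approaches the speed at a single moment?", None),
])
print(fits_context(prompt))  # a short dialogue fits comfortably
```

In a real tutoring loop, older turns would be dropped or summarized once `fits_context` starts failing, keeping the dialogue within the 4k window.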

Good For

  • Developing AI-powered educational tools and chatbots.
  • Applications requiring interactive learning support and academic guidance.
  • Research into AI in education and intelligent tutoring systems, particularly those based on the CLASS Meet SPOCK framework.