luffycodes/vicuna-class-shishya-13b-ep3

Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Published: Dec 21, 2023 · License: llama2 · Architecture: Transformer · Open Weights

luffycodes/vicuna-class-shishya-13b-ep3 is a 13-billion-parameter language model with a 4096-token context length, developed by luffycodes. It is fine-tuned as an educational tutoring chatbot grounded in learning science principles, as detailed in the "CLASS Meet SPOCK" research paper. Its primary strength is providing educational assistance and interactive learning experiences.


Model Overview

luffycodes/vicuna-class-shishya-13b-ep3 is a 13 billion parameter language model designed for educational tutoring. With a context length of 4096 tokens, this model is built upon the principles outlined in the research paper "CLASS Meet SPOCK: An Education Tutoring Chatbot based on Learning Science Principles" (arXiv:2305.13272).

Key Capabilities

  • Educational Tutoring: Specialized in providing interactive and principle-based educational assistance.
  • Learning Science Integration: Incorporates methodologies derived from learning science research to enhance tutoring effectiveness.
  • Contextual Understanding: Utilizes a 4096-token context window to maintain coherent and relevant educational dialogues.
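As a Vicuna-derived model, it presumably expects a conversation-style prompt. The sketch below assembles one using the Vicuna v1.1 template (a system prompt followed by `USER:`/`ASSISTANT:` turns); the exact template this fine-tune was trained on is an assumption, and `build_tutor_prompt` is a hypothetical helper, not part of any released API.

```python
def build_tutor_prompt(system_prompt, turns):
    """Assemble a Vicuna-v1.1-style conversation prompt.

    `turns` is a list of (user_message, assistant_reply_or_None) pairs;
    leaving the final reply as None ends the prompt at "ASSISTANT:"
    so the model completes the tutor's answer.
    """
    parts = [system_prompt]
    for user_msg, assistant_msg in turns:
        parts.append(f"USER: {user_msg}")
        if assistant_msg is None:
            parts.append("ASSISTANT:")  # model continues from here
        else:
            parts.append(f"ASSISTANT: {assistant_msg}")
    return " ".join(parts)

prompt = build_tutor_prompt(
    "You are a patient science tutor who guides students step by step.",
    [("What causes the seasons on Earth?", None)],
)
print(prompt)
```

The trailing `ASSISTANT:` marker is what cues the model to generate the tutor's turn; completed turns are kept in the prompt so the dialogue stays coherent across exchanges.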

Good For

  • Developing AI-powered educational chatbots.
  • Applications requiring intelligent tutoring systems.
  • Research into AI in education, particularly concerning learning science principles.

This model is particularly suited for scenarios where an AI needs to act as a knowledgeable and pedagogically sound tutor, guiding users through learning processes based on established educational research.
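Because the context window is fixed at 4096 tokens, long tutoring sessions will eventually overflow it. One common approach is to keep the system prompt and drop the oldest dialogue turns first. A minimal sketch follows; the 4-characters-per-token heuristic, the reserved-reply budget, and the helper names are illustrative assumptions — a real implementation would count tokens with the model's own tokenizer.

```python
MAX_CTX_TOKENS = 4096        # the model's context length
RESERVED_FOR_REPLY = 512     # leave room for the generated answer

def rough_token_count(text):
    # Crude heuristic: roughly 4 characters per token for English text.
    # Replace with the model tokenizer for accurate counts.
    return max(1, len(text) // 4)

def trim_history(system_prompt, turns):
    """Keep the most recent turns that fit in the context budget.

    `turns` is a list of message strings (alternating user/assistant),
    oldest first. The system prompt is always retained.
    """
    budget = (MAX_CTX_TOKENS - RESERVED_FOR_REPLY
              - rough_token_count(system_prompt))
    kept = []
    for turn in reversed(turns):       # walk newest-first
        cost = rough_token_count(turn)
        if budget - cost < 0:
            break                      # oldest remaining turns are dropped
        budget -= cost
        kept.append(turn)
    return list(reversed(kept))        # restore chronological order

history = ["x" * 16000, "short question", "short answer"]
trimmed = trim_history("You are a tutor.", history)
```

Dropping whole turns (rather than truncating mid-turn) keeps each remaining exchange intact, which matters for a tutoring dialogue where partial messages would confuse the model.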