Koskath/arete-llama-3.2-3b_5

Hugging Face

Text Generation · Model Size: 3.2B · Quant: BF16 · Context Length: 32k · Published: Jan 21, 2026 · Architecture: Transformer

Koskath/arete-llama-3.2-3b_5 is a 3.2-billion-parameter instruction-tuned language model developed by Konstantinos Katharakis under the Arete name, a fine-tune of Llama 3.2 3B. It is designed as an AI teaching assistant that provides university students with step-by-step guidance aligned with specific course curricula.


Koskath/arete-llama-3.2-3b_5: AI Teaching Assistant

This model, developed by Konstantinos Katharakis, is an instruction-tuned fine-tune of Llama 3.2 3B with 3.2 billion parameters and a 32,768-token context length. It is designed to function as an AI teaching assistant, providing structured, step-by-step guidance to university students.

Key Capabilities

  • Curriculum-based guidance: Delivers academic support tailored to specific university course curricula.
  • Step-by-step instruction: Focuses on breaking down complex topics into manageable, sequential steps.
  • Educational support: Acts as a dedicated AI assistant for student learning.
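A tutoring interaction like the one described above is typically driven by a chat-style prompt. As a minimal sketch, assuming the fine-tune keeps the base Llama 3.2 chat template (the model card does not confirm this; in practice, prefer `tokenizer.apply_chat_template` from `transformers`, which reads the template shipped with the model):

```python
# Sketch: formatting a step-by-step tutoring prompt in the Llama 3.x chat
# template. ASSUMPTION: this fine-tune reuses the base Llama 3.2 template;
# the system/user text below is illustrative, not from the model card.

def build_prompt(system: str, user: str) -> str:
    """Format one system + user turn in the Llama 3.x chat template."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_prompt(
    system="You are a university teaching assistant. Explain step by step, "
           "one step at a time, following the course curriculum.",
    user="Walk me through solving 2x + 3 = 11.",
)
print(prompt)
```

The trailing assistant header asks the model to begin its reply there; generation should stop at the `<|eot_id|>` token.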

Good for

  • Developing AI-powered educational tools for higher education.
  • Integrating into learning management systems for personalized student assistance.
  • Applications requiring detailed, structured explanations based on academic content.

For more details on the underlying research, refer to the paper: Small Language Models for Curriculum based guidance.