matteoangeloni/EduRaccoon

Text Generation · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · Concurrency Cost: 1 · Published: Aug 23, 2025 · License: apache-2.0 · Architecture: Transformer · Open Weights

EduRaccoon by matteoangeloni is an 8 billion parameter large language model, fine-tuned from LLaMA 3.1 8B (Unsloth), designed as a multilingual educational assistant. It provides structured, clear, and didactic responses to academic and school-related questions across subjects such as science, math, history, and literature. The model is optimized to respond in the language of the query and handles non-educational questions concisely.


EduRaccoon: Multilingual Educational AI Assistant

EduRaccoon, developed by matteoangeloni, is an 8 billion parameter large language model built upon LLaMA 3.1 8B (Unsloth). Its primary purpose is to function as a multilingual educational assistant, capable of responding in the language of the query (e.g., Italian, English, Spanish, French) with structured, clear, and didactic answers.

Key Capabilities

  • Multilingual: Responds in the language it is addressed in.
  • Education-Oriented: Specifically optimized for school and academic questions.
  • Broad Subject Coverage: Proficient in sciences, mathematics, history, literature, philosophy, civic education, and digital competencies.
  • Efficient Training: Optimized using Unsloth for 2x faster training and reduced memory usage.
  • Base Model: Fine-tuned from unsloth/meta-llama-3.1-8b-unsloth-bnb-4bit.
  • Training Data: Trained with Supervised Fine-Tuning (SFT) via TRL on a dataset of approximately 100k educational prompts covering civic education, human rights, and digital competencies (see the training sketch after this list).
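
The card names Unsloth and TRL but does not publish the training script, so the following is only a minimal sketch of what such an SFT setup could look like, assuming Unsloth's `FastLanguageModel` and TRL's `SFTTrainer` APIs. The dataset identifier, LoRA rank, and hyperparameters are illustrative placeholders, not the author's actual values.

```python
# Minimal SFT sketch, NOT the author's actual script: the dataset id and all
# hyperparameters below are illustrative placeholders.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer
from unsloth import FastLanguageModel

# Load the 4-bit Unsloth base model named in the card.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/meta-llama-3.1-8b-unsloth-bnb-4bit",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small set of weights is updated during SFT.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Hypothetical repo id standing in for the ~100k educational prompts;
# the dataset is assumed to expose a plain "text" column.
dataset = load_dataset("matteoangeloni/edu-prompts-100k", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    args=SFTConfig(
        dataset_text_field="text",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```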

Good For

  • Generating educational explanations and summaries.
  • Answering academic questions across a wide range of subjects.
  • Providing structured and clear responses in multiple languages for learning purposes.
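
As a usage illustration, here is a minimal inference sketch, assuming the published weights load with Hugging Face `transformers` under the repo id above and include a chat template; the Italian prompt is just one example of the language-matching behavior.

```python
# Minimal inference sketch; assumes the weights load via transformers and
# ship with a chat template. The prompt is illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "matteoangeloni/EduRaccoon"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

# An Italian question should receive a structured, didactic Italian answer,
# since the model responds in the language of the query.
messages = [{"role": "user",
             "content": "Spiegami la fotosintesi in modo semplice e schematico."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```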