matteoangeloni/EduRaccoon
TEXT GENERATION · Open Weights · Cold

- Concurrency Cost: 1
- Model Size: 8B
- Quant: FP8
- Ctx Length: 32k
- Published: Aug 23, 2025
- License: apache-2.0
- Architecture: Transformer
EduRaccoon by matteoangeloni is an 8-billion-parameter large language model, fine-tuned from LLaMA 3.1 8B (Unsloth) as a multilingual educational assistant. It is tuned to give structured, clear, and didactic answers to academic and school-related questions across subjects such as science, math, history, and literature. The model responds in the language of the query and answers non-educational questions concisely.
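A minimal usage sketch with the Hugging Face `transformers` pipeline, assuming the model is published under the repo id `matteoangeloni/EduRaccoon`; the system prompt below is illustrative, not part of the model card:

```python
def build_messages(question: str) -> list[dict]:
    """Build a chat-style payload for an instruction-tuned assistant.

    The system prompt is a hypothetical example reflecting the model's
    described behavior (didactic answers, reply in the query's language).
    """
    return [
        {
            "role": "system",
            "content": (
                "You are EduRaccoon, a multilingual educational assistant. "
                "Give structured, didactic answers and reply in the "
                "language of the question."
            ),
        },
        {"role": "user", "content": question},
    ]


if __name__ == "__main__":
    # Requires `pip install transformers` and enough memory for an 8B model.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="matteoangeloni/EduRaccoon",  # repo id assumed from the page
        torch_dtype="auto",
    )
    out = generator(build_messages("What is photosynthesis?"), max_new_tokens=256)
    print(out[0]["generated_text"])
```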