CraneAILabs/EduGanda-Gemma-3-1B

Text generation · Model size: 1B · Quant: BF16 · Context length: 32k · Published: May 5, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

EduGanda-Gemma-3-1B by CraneAILabs is a 1-billion-parameter bilingual (English/Luganda) model built on the Gemma 3 1B architecture with a 32,768-token context length. It is designed specifically for generating pedagogical content aligned with Uganda's P1–P3 curriculum, excelling at tasks like creating structured lesson plans and literacy assessments. The model is a linear weight interpolation (70% Learner + 30% GRPO-600) of checkpoints derived from a Luganda continually pre-trained base, optimized for educational AI in low-resource languages.


EduGanda-Gemma-3-1B: Bilingual Pedagogical AI

EduGanda-Gemma-3-1B is a 1-billion-parameter model developed by CraneAILabs, specifically engineered for educational content generation in Uganda. It is a bilingual model, proficient in both English and Luganda, and is built upon the Gemma 3 1B architecture.

Key Capabilities & Features

  • Bilingual Content Generation: Excels at creating pedagogical content in English and Luganda, aligned with Uganda's P1–P3 curriculum.
  • Unique Architecture: Achieved through a linear weight interpolation (70% Learner + 30% GRPO-600) from CraneAILabs/ganda-gemma-1b, which itself is a Luganda continual pre-training of google/gemma-3-1b-it.
  • Strong Pedagogical Performance: Achieves 66% on Pedagogical Content Knowledge (PCK) and 58.8% on Luganda Linguistic Understanding (ELL MC), significantly outperforming its unmodified base model.
  • No Additional Training: Notably, EduGanda is a merge of existing models, not a fine-tuned model, showcasing an innovative approach to model development.
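The merge recipe above (70% Learner + 30% GRPO-600) amounts to per-parameter linear interpolation of two checkpoints. A minimal sketch, using plain Python scalars in place of weight tensors; the function and parameter names are illustrative assumptions, not CraneAILabs' actual merge script:

```python
def interpolate_weights(learner, grpo, alpha=0.7):
    """Merge two checkpoints by per-parameter linear interpolation:
    merged = alpha * learner + (1 - alpha) * grpo."""
    assert learner.keys() == grpo.keys(), "checkpoints must share parameter names"
    return {name: alpha * learner[name] + (1 - alpha) * grpo[name] for name in learner}

# Toy example with scalar "parameters" standing in for weight tensors.
learner = {"layer.0.weight": 1.0, "layer.0.bias": 0.5}
grpo = {"layer.0.weight": 0.0, "layer.0.bias": 0.5}
merged = interpolate_weights(learner, grpo, alpha=0.7)  # 70% Learner + 30% GRPO-600
```

With real models the same loop runs over the two state dicts' tensors; because no gradient updates are involved, the merge is cheap compared to fine-tuning.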

Intended Use Cases

  • Generating structured bilingual lesson plans for Ugandan primary school teachers.
  • Creating literacy assessments (MCQ, fill-in-blank) tailored to the P1–P3 curriculum.
  • Functioning as an offline teacher assistant on mobile devices, generating approximately 30 tokens/second.
  • Research into educational AI for low-resource languages.
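As a concrete illustration of the lesson-plan use case, the sketch below formats a single user turn in Gemma's chat-turn markup. The prompt text is an invented example; in practice the tokenizer's `apply_chat_template` produces this formatting for you:

```python
def build_gemma_prompt(user_message: str) -> str:
    """Format one user turn in Gemma's chat-turn markup
    (normally produced by the tokenizer's apply_chat_template)."""
    return (
        "<start_of_turn>user\n"
        f"{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

# Hypothetical P2 literacy request mixing English instructions with a
# Luganda topic, matching the bilingual use case described above.
prompt = build_gemma_prompt(
    "Create a structured P2 literacy lesson plan in English and Luganda "
    "on the topic of greetings (okulamusa)."
)
```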

Known Limitations

  • Exhibits position bias in multiple-choice questions.
  • Struggles with short-form Luganda linguistic understanding and arithmetic tasks.
  • Luganda coherence degrades beyond roughly 500 tokens; stable generation requires `repetition_penalty=1.2`.
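Given these limitations, a conservative generation setup caps output length near the ~500-token coherence boundary and applies the recommended repetition penalty. A minimal sketch of keyword arguments for a `model.generate()` call; only `repetition_penalty=1.2` comes from the model card, the other values are illustrative assumptions:

```python
# Illustrative generation settings; only repetition_penalty=1.2 is from
# the model card, the remaining values are assumed defaults.
generation_kwargs = {
    "max_new_tokens": 500,      # stay near the ~500-token Luganda coherence limit
    "repetition_penalty": 1.2,  # required for stable generation per the model card
    "do_sample": True,
    "temperature": 0.7,         # assumed sampling temperature
}
```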