oscarstories/lorastral24b_0604

Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Jun 6, 2025 · License: MIT · Architecture: Transformer · Open Weights

oscarstories/lorastral24b_0604 is a 24B parameter, German-language causal language model developed by HeyQQ GmbH, fine-tuned from Mistral Small 24B Instruct. It is optimized for generating safe, engaging, and educational stories for children aged 6-12 and follows a structured prompt framework. The model produces age-appropriate narratives for primary education, with high readability and low gender bias in its outputs.


Overview

oscarstories/lorastral24b_0604, or LORA, is a specialized 24B parameter language model developed by HeyQQ GmbH. It is a fine-tuned version of mistralai/Mistral-Small-24B-Instruct-2501, optimized for generating educational and engaging stories in German for children aged 6-12. The model is designed to produce structured narratives with moral, cognitive, and pedagogical goals, making it well suited to primary education applications.
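Since the model is a Mistral Small fine-tune, it can be served behind any OpenAI-compatible endpoint (e.g. via vLLM) and driven with a system/user message pair. The exact prompt framework used by LORA is not published here, so the prompts and parameter values below are illustrative placeholders only:

```python
import json

# Hypothetical system prompt -- the real framework used to train/query LORA
# is not documented in this card; treat this as a placeholder.
SYSTEM_PROMPT = (
    "Du bist ein Geschichtenerzähler für Kinder der Klassen 1-4. "
    "Schreibe genau 3 Absätze in einfachem, sicherem Deutsch."
)

def build_story_request(topic: str,
                        model: str = "oscarstories/lorastral24b_0604") -> dict:
    """Build an OpenAI-compatible chat-completion payload for a story request."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": f"Schreibe eine Geschichte über: {topic}"},
        ],
        "max_tokens": 1024,   # illustrative values, not tuned recommendations
        "temperature": 0.7,
    }

payload = build_story_request("einen mutigen Igel")
print(json.dumps(payload, ensure_ascii=False, indent=2))
```

The payload can then be POSTed to the serving endpoint's `/v1/chat/completions` route with any HTTP client.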

Key Capabilities

  • Age-Appropriate Storytelling: Generates short stories (exactly 3 paragraphs) tailored for children in grades 1-4.
  • German Language Focus: Optimized for German-language content, supporting educational environments.
  • Structured Prompting: Utilizes a specific system and user prompt framework to ensure consistent, simple, and safe story generation.
  • High Readability: Achieves Flesch Reading Ease scores above 70 and Wiener Sachtextformel scores below 5, indicating excellent readability for its target age group.
  • Low Bias: Demonstrates low systemic gender bias with GenBit scores well below the target threshold and near-balanced female-to-male representation.

Good for

  • Educational storytelling platforms targeting primary school children (grades 1-4).
  • Generating German-language content for child-friendly learning applications.
  • Creating structured narratives with specific pedagogical objectives.

LORA was trained on a curated dataset including Klexikon and KiwiThek, both German-language children's encyclopedias, ensuring high-quality, age-appropriate, and safe content generation.