Overview
oscarstories/lorastral24b_0604, or LORA, is a specialized 24B-parameter language model developed by HeyQQ GmbH. It is a fine-tuned version of mistralai/Mistral-Small-24B-Instruct-2501, optimized for generating educational and engaging stories in German for children aged 6-12. The model is designed to produce structured narratives with moral, cognitive, and pedagogical goals, making it well suited to primary education applications.
Key Capabilities
- Age-Appropriate Storytelling: Generates short stories (exactly 3 paragraphs) tailored for children in grades 1-4.
- German Language Focus: Optimized for German-language content, supporting educational environments.
- Structured Prompting: Utilizes a specific system and user prompt framework to ensure consistent, simple, and safe story generation.
- High Readability: Achieves Flesch Reading Ease scores above 70 and Wiener Sachtextformel scores below 5, indicating excellent readability for its target age group.
- Low Bias: Demonstrates low systemic gender bias with GenBit scores well below the target threshold and near-balanced female-to-male representation.
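The Flesch Reading Ease target above can be checked programmatically. The sketch below uses Amstad's German adaptation of the Flesch formula (FRE = 180 − ASL − 58.5 × ASW, where ASL is average words per sentence and ASW is average syllables per word) with a deliberately naive vowel-group syllable counter; it is an illustration of the metric, not the evaluation pipeline actually used for LORA.

```python
import re

def count_syllables_de(word: str) -> int:
    """Rough German syllable count: one syllable per vowel group.
    A naive heuristic, sufficient for illustrating the formula."""
    groups = re.findall(r"[aeiouäöüy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease_de(text: str) -> float:
    """Flesch Reading Ease, Amstad's German variant:
    FRE = 180 - ASL - 58.5 * ASW."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-zÄÖÜäöüß]+", text)
    asl = len(words) / max(1, len(sentences))          # avg sentence length
    asw = sum(count_syllables_de(w) for w in words) / max(1, len(words))
    return 180 - asl - 58.5 * asw
```

Short, simple sentences with mostly one- and two-syllable words, as in stories for grades 1-4, score comfortably above the 70-point threshold under this formula.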
Good for
- Educational storytelling platforms targeting primary school children (grades 1-4).
- Generating German-language content for child-friendly learning applications.
- Creating structured narratives with specific pedagogical objectives.
LORA was trained on a curated dataset including Klexikon and KiwiThek, both German-language children's encyclopedias, ensuring high-quality, age-appropriate, and safe content generation.
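The structured prompting framework mentioned above pairs a fixed system prompt with a user request. The exact prompt wording used in production is not reproduced here, so the German texts below are illustrative placeholders showing only the general shape of a chat-formatted request:

```python
# Hypothetical sketch of a chat-style prompt for LORA.
# The system/user texts are placeholders, not the actual prompts.

def build_story_messages(topic: str, grade: int) -> list[dict]:
    """Assemble a message list requesting a 3-paragraph children's story."""
    system = (
        "Du bist ein Geschichtenerzähler für Kinder der Klassen 1-4. "
        "Schreibe genau drei Absätze in einfachem, sicherem Deutsch."
    )
    user = f"Schreibe eine Geschichte über {topic} für Klasse {grade}."
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]
```

Such a message list can then be serialized with a chat template (e.g. `tokenizer.apply_chat_template(...)` in the Hugging Face `transformers` library) before generation.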