VAGOsolutions/SauerkrautLM-7b-HerO
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Nov 24, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights

VAGOsolutions/SauerkrautLM-7b-HerO is a 7-billion-parameter German-English bilingual language model based on the Mistral architecture, developed by VAGO solutions. It was created by merging Teknium's OpenHermes-2.5-Mistral-7B and Open-Orca's Mistral-7B-OpenOrca using the gradient SLERP method, then fine-tuning the merged model on an augmented German dataset. The model is strong at German language understanding while retaining solid English capabilities, avoiding the performance degradation that bilingual fine-tuning often introduces.
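To illustrate the merge method mentioned above: SLERP (spherical linear interpolation) blends two parameter tensors along the arc between them rather than along a straight line, and a "gradient" SLERP varies the interpolation factor per layer. The sketch below is a minimal, hypothetical illustration of the math, not the actual merge configuration used for this model; the per-layer schedule shown is an assumption.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors."""
    v0_unit = v0 / (np.linalg.norm(v0) + eps)
    v1_unit = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_unit, v1_unit), -1.0, 1.0)
    omega = np.arccos(dot)  # angle between the two weight vectors
    if omega < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    so = np.sin(omega)
    return (np.sin((1 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# "Gradient" SLERP: the interpolation factor t changes per layer, ramping
# from favoring one parent model toward the other (schedule is hypothetical).
num_layers = 32
ts = np.linspace(0.3, 0.7, num_layers)
```

At t=0 the result is exactly the first parent's weights, at t=1 the second's; intermediate values trace the arc between them, which tends to preserve weight norms better than naive averaging.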
