jphme/orca_mini_v2_ger_7b
Task: Text generation
Model size: 7B parameters
Quantization: FP8
Context length: 4k
Concurrency cost: 1
Published: Jul 4, 2023
License: cc-by-nc-sa-4.0
Architecture: Transformer
Weights: Open

The jphme/orca_mini_v2_ger_7b model is a 7 billion parameter variant of Pankaj Mathur's Orca Mini V2, fine-tuned specifically for German. Developed by jphme, it builds on the original's explain-tuned datasets derived from WizardLM, Alpaca, and Dolly-V2, and is optimized for understanding and generating German text. While its capabilities are limited by the small, experimental German fine-tuning dataset and its modest parameter count, it shows markedly improved German proficiency over the base model, making it suitable for German-centric NLP tasks.
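The card itself includes no usage code. Below is a minimal sketch of how such a model might be queried with the Hugging Face `transformers` library, assuming the weights are published on the Hub under the same identifier and that the model follows the Orca Mini "### System / ### User / ### Response" prompt convention of its base model; both assumptions should be checked against the upstream model card.

```python
def build_prompt(system: str, user: str) -> str:
    """Assemble a prompt in the assumed Orca-Mini-style template."""
    return f"### System:\n{system}\n\n### User:\n{user}\n\n### Response:\n"


def generate(user_prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model and generate a German completion.

    Heavyweight: downloads ~7B parameters on first call, so the
    import and loading happen lazily inside this function.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "jphme/orca_mini_v2_ger_7b"  # identifier from the card
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    prompt = build_prompt(
        "Du bist ein hilfreicher Assistent.",  # German system prompt
        user_prompt,
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Show the assembled prompt without triggering a model download.
    print(build_prompt(
        "Du bist ein hilfreicher Assistent.",
        "Erkläre kurz, was ein Transformer ist.",
    ))
```

The prompt helper is separated from generation so the template can be verified cheaply before committing to a multi-gigabyte download.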
