cstr/Spaetzle-v12-7b
Text Generation · Open Weights
- Concurrency Cost: 1
- Model Size: 7B
- Quant: FP8
- Ctx Length: 4k
- Published: Mar 11, 2024
- License: cc-by-sa-4.0
- Architecture: Transformer
cstr/Spaetzle-v12-7b is a 7-billion-parameter language model created with LazyMergekit by merging flemmingmiguel/NeuDist-Ro-7B, Blizado/discolm-mfto-7b-german-v0.1, and ResplendentAI/Flora_DPO_7B on top of the base model mayflowergmbh/Wiedervereinigung-7b-dpo-laser. The model is optimized for German-language tasks: it shows a slight improvement over its predecessor, Spaetzle-v8-7b, on German benchmarks while retaining a 4096-token context length. With an EQ-Bench (de) score of 64.81, it is suited to applications that require strong German language understanding and generation.
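For readers unfamiliar with how such merges are specified: LazyMergekit generates a mergekit YAML config from a list of source models. The sketch below is illustrative only, using the models named above; the actual merge method, layer ranges, and weights used for Spaetzle-v12-7b are not stated here, so the `dare_ties` method and the density/weight values are assumptions, not the published recipe.

```yaml
# Hypothetical mergekit config (NOT the published Spaetzle-v12-7b recipe).
# Merge method and parameter values are illustrative assumptions.
models:
  - model: flemmingmiguel/NeuDist-Ro-7B
    parameters:
      density: 0.60   # assumed: fraction of delta weights kept
      weight: 0.30    # assumed: contribution to the merged model
  - model: Blizado/discolm-mfto-7b-german-v0.1
    parameters:
      density: 0.60
      weight: 0.30
  - model: ResplendentAI/Flora_DPO_7B
    parameters:
      density: 0.60
      weight: 0.30
merge_method: dare_ties   # assumed; mergekit also supports slerp, ties, linear, ...
base_model: mayflowergmbh/Wiedervereinigung-7b-dpo-laser
dtype: bfloat16
```

A config like this would typically be run with `mergekit-yaml config.yml ./output-dir`, which is how LazyMergekit drives the merge under the hood.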