cstr/Spaetzle-v8-7b
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 10, 2024 · Architecture: Transformer

cstr/Spaetzle-v8-7b is a 7-billion-parameter merged language model, built on mayflowergmbh/Wiedervereinigung-7b-dpo-laser and incorporating flemmingmiguel/NeuDist-Ro-7B, johannhartmann/Brezn3, and ResplendentAI/Flora_DPO_7B. The merge targets solid performance in both German and English, with an emphasis on instruction following and reasoning and on consistent behavior. It offers a 4096-token context length and suits use cases where robust instruction following is prioritized over perfect German grammar and orthography.
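Because the context window is limited to 4096 tokens, prompts plus the generation budget need to fit inside it. A minimal sketch of such a check, using a hypothetical helper with a rough 4-characters-per-token heuristic (an assumption; the model's actual tokenizer will count differently):

```python
# Context length of cstr/Spaetzle-v8-7b, per the model card above.
CTX_LEN = 4096


def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token (a heuristic,
    not the model's real tokenizer)."""
    return max(1, len(text) // 4)


def fits_context(prompt: str, max_new_tokens: int = 512) -> bool:
    """Check whether the prompt plus the generation budget fits the
    4096-token window."""
    return estimate_tokens(prompt) + max_new_tokens <= CTX_LEN


print(fits_context("Wie funktioniert ein Transformer?"))  # short prompt fits
print(fits_context("x" * 20000))  # ~5000 estimated tokens, does not fit
```

For production use, replace `estimate_tokens` with a count from the model's actual tokenizer; the heuristic here only illustrates the budgeting logic.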
