malteos/hermeo-7b
Text Generation · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Dec 12, 2023 · License: apache-2.0 · Architecture: Transformer · Concurrency Cost: 1

malteos/hermeo-7b is a 7-billion-parameter causal decoder-only transformer language model, created by merging DPOpenHermes-7B-v2 and leo-mistral-hessianai-7b-chat, both fine-tuned variants of Mistral-7B-v0.1. Developed by malteos, the model targets strong performance in both English and German, with competitive results on German benchmarks such as Hellaswag-DE and ARC-DE. It is intended for general text generation and conversational AI applications that require bilingual capability.
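A minimal usage sketch with the Hugging Face `transformers` library follows. The ChatML-style prompt template is an assumption based on the model's DPOpenHermes lineage (check the model card for the exact template), and the `transformers` import is deferred into `generate` so the prompt helper can be used without the library installed.

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a single-turn ChatML prompt (assumed format for this model)."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load malteos/hermeo-7b and complete the given prompt (downloads ~7B weights)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("malteos/hermeo-7b")
    model = AutoModelForCausalLM.from_pretrained("malteos/hermeo-7b", device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    prompt = build_chatml_prompt(
        "Du bist ein hilfreicher Assistent.",
        "Was ist die Hauptstadt von Hessen?",
    )
    print(prompt)
```

Because the model is bilingual, the system and user turns can be written in either German or English without changing the prompt structure.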
