aari1995/germeo-7b-laser
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 9, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

aari1995/germeo-7b-laser is a 7-billion-parameter causal decoder-only transformer language model developed by aari1995, merged from leo-mistral-hessianai-7b-chat and DPOpenHermes-7B-v2. The model is designed to respond in German while retaining strong English understanding, and applies LASER (layer-selective rank reduction) to improve language comprehension. It excels at German-language tasks while maintaining competitive English benchmark performance, making it suitable for applications that require robust German generation alongside English comprehension.
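A minimal usage sketch with Hugging Face `transformers` is shown below. The ChatML-style prompt template is an assumption (DPOpenHermes-7B-v2, one of the merge sources, uses ChatML); verify the expected prompt format on the model card before relying on it.

```python
def build_chatml_prompt(user_message: str) -> str:
    """Wrap a user message in a ChatML-style prompt (assumed template)."""
    return (
        "<|im_start|>user\n"
        f"{user_message}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load aari1995/germeo-7b-laser and generate a completion.

    Heavy imports are kept inside the function so the prompt helper above
    can be used without downloading the 7B checkpoint.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "aari1995/germeo-7b-laser"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.float16, device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


# Example (requires a GPU and downloading the model weights):
#   print(generate(build_chatml_prompt("Erkläre kurz, was ein Transformer ist.")))
```

Since the model targets German output, prompts can be written in German directly; English prompts are understood but replies are expected in German.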
