DRXD1000/Phoenix-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quantization: FP8 · Context Length: 4k · Published: Jan 10, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

DRXD1000/Phoenix-7B is a 7-billion-parameter, GPT-style causal language model developed by Matthias Uhlig, fine-tuned from LeoLM/leo-mistral-hessianai-7b with Direct Preference Optimization (DPO). Built specifically for German, it is trained on instruction and DPO preference datasets translated into German. On the MT-Bench-DE benchmark, Phoenix-7B outperforms the much larger LeoLM/Llama-2-70b-chat in the roleplay and reasoning categories.
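
A minimal usage sketch, assuming the model follows the standard Hugging Face transformers interface and ships a chat template; the German prompt and the generation settings are illustrative, not taken from the model card:

```python
# Sketch: load DRXD1000/Phoenix-7B and generate a German response.
# Assumes the checkpoint is on the Hugging Face Hub and defines a chat
# template; serving quantization (FP8 above) is platform-side, so this
# local example uses float16 instead.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DRXD1000/Phoenix-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision; fits a single ~16 GB GPU
    device_map="auto",
)

# German-language instruction, matching the model's target language.
messages = [
    {"role": "user",
     "content": "Erkläre kurz, was Direct Preference Optimization ist."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(
    inputs, max_new_tokens=256, do_sample=True, temperature=0.7
)
# Strip the prompt tokens and decode only the generated continuation.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```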
