EVA-UNIT-01/EVA-Qwen2.5-14B-v0.0
Text generation · Concurrency cost: 1 · Model size: 14.8B · Quant: FP8 · Context length: 32k · Published: Sep 30, 2024 · License: apache-2.0 · Architecture: Transformer

EVA-UNIT-01/EVA-Qwen2.5-14B-v0.0 is a 14.8-billion-parameter, full-parameter fine-tune of Qwen2.5-14B, developed by Kearm and Auri. The model is optimized for roleplay and story-writing, trained on a mixture of synthetic and natural data. With a context length of up to 131,072 tokens, it aims to deliver versatility, creativity, and narrative depth in generative text applications.
