EVA-UNIT-01/EVA-Qwen2.5-14B-v0.1
Text generation · Concurrency cost: 1 · Model size: 14.8B · Quant: FP8 · Context length: 32k · Published: Oct 3, 2024 · License: apache-2.0 · Architecture: Transformer

EVA-UNIT-01/EVA-Qwen2.5-14B-v0.1 is a 14.8-billion-parameter, full-parameter fine-tune of Qwen2.5 developed by Kearm and Auri. It specializes in roleplay and story writing, trained on a mixture of synthetic and natural data, including an expanded Celeste 70B 0.1 data mixture. The model is tuned for versatility, creativity, and narrative "flavor", and supports a 131,072-token context length.
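Since this is a standard Hugging Face checkpoint, it can presumably be loaded with the `transformers` library. The sketch below is a minimal usage example, assuming the model follows the Qwen2.5 family's ChatML prompt format; the system/user strings and generation parameters are illustrative, not from the model card.

```python
MODEL_ID = "EVA-UNIT-01/EVA-Qwen2.5-14B-v0.1"


def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt (the format used by the Qwen2.5 family)."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )


def generate_story(user_prompt: str, max_new_tokens: int = 256) -> str:
    """Download the weights and sample a completion.

    Calling this pulls ~15B parameters from the Hub, so it is defined
    but not invoked at import time.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    prompt = build_chatml_prompt(
        "You are a creative storytelling assistant.", user_prompt
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(
        **inputs, max_new_tokens=max_new_tokens, do_sample=True, temperature=0.8
    )
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    )
```

For roleplay and story writing, a higher sampling temperature (around 0.8 here) is a common starting point; the model card does not prescribe specific sampling settings.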
