EVA-UNIT-01/EVA-Qwen2.5-7B-v0.1
Text Generation · Concurrency Cost: 1 · Model Size: 7.6B · Quant: FP8 · Ctx Length: 32K · Published: Oct 5, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

EVA-UNIT-01/EVA-Qwen2.5-7B-v0.1 is a 7.6-billion-parameter full-parameter finetune of Qwen2.5-7B, developed by Kearm and Auri. The model is optimized for roleplay and storywriting, trained on a mixture of synthetic and natural data. It supports an extended context length of 131,072 tokens, making it well suited to long-form, detailed narrative generation.
