EVA-UNIT-01/EVA-Qwen2.5-1.5B-v0.0
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Dec 28, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
EVA-UNIT-01/EVA-Qwen2.5-1.5B-v0.0 is a 1.5-billion-parameter full-parameter fine-tune by Kearm, Auri, and Cahvay, built on the Qwen2.5 architecture with a 32,768-token context length. It is a small-scale specialist model optimized for roleplay and storywriting tasks. The model was trained on an expanded mixture of synthetic and natural data, including the Celeste 70B 0.1 data mixture, to enhance the versatility, creativity, and "flavor" of its generated text.
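Below is a minimal sketch of loading and prompting the model with Hugging Face `transformers`, assuming the weights are published on the Hub under the repo id above. The BF16 dtype matches the published quantization; the system prompt, sampling settings, and example message are illustrative, not part of the model card.

```python
# Minimal sketch: load the model in BF16 and generate a short
# storywriting completion. Assumes the repo id is available on the
# Hugging Face Hub; prompt and sampling parameters are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EVA-UNIT-01/EVA-Qwen2.5-1.5B-v0.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the published BF16 weights
    device_map="auto",
)

# Qwen2.5-based models ship a chat template, so we can format a
# roleplay/storywriting-style prompt with apply_chat_template.
messages = [
    {"role": "system", "content": "You are a creative storytelling assistant."},
    {"role": "user", "content": "Write the opening paragraph of a mystery set in a lighthouse."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Sampling (rather than greedy decoding) is a reasonable default for a creative-writing specialist; at 1.5B parameters the model also fits comfortably on a single consumer GPU in BF16.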