EVA-UNIT-01/EVA-Qwen2.5-32B-v0.0
Text generation model.
- Model size: 32.8B parameters
- Quantization: FP8
- Context length: 32k
- Concurrency cost: 2
- Published: Oct 23, 2024
- License: apache-2.0
- Architecture: Transformer
EVA-Qwen2.5-32B-v0.0 is a 32-billion-parameter, full-parameter finetune of the Qwen2.5 architecture, developed by Kearm and Auri. The model specializes in roleplay and storywriting, leveraging an expanded data mixture for greater versatility, creativity, and narrative "flavor". It is optimized for generating engaging, nuanced long-form text in creative applications.
Popular Sampler Settings

The three parameter combinations most commonly used by Featherless users for this model adjust the following samplers (specific values are not captured here): temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
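As an illustration, these sampler parameters map onto the fields of an OpenAI-style chat completion request; min_p and repetition_penalty are extensions supported by some inference servers rather than standard OpenAI fields. This is a minimal sketch, assuming such an endpoint, and the numeric values below are placeholders, not the actual user configurations:

```python
# Sketch: building an OpenAI-style request payload for this model with
# explicit sampler settings. Values are illustrative placeholders, NOT
# the actual top Featherless configurations.
import json


def build_request(prompt: str) -> dict:
    return {
        "model": "EVA-UNIT-01/EVA-Qwen2.5-32B-v0.0",
        "messages": [{"role": "user", "content": prompt}],
        # Samplers listed in the settings panel above:
        "temperature": 0.9,          # sharpens/softens the token distribution
        "top_p": 0.95,               # nucleus-sampling cumulative cutoff
        "top_k": 40,                 # keep only the 40 most likely tokens
        "frequency_penalty": 0.0,    # penalize tokens by repeat count
        "presence_penalty": 0.0,     # penalize tokens that appeared at all
        "repetition_penalty": 1.05,  # multiplicative repetition damping (extension)
        "min_p": 0.05,               # drop tokens below 5% of the top token's
                                     # probability (extension)
    }


payload = build_request("Write the opening scene of a heist story.")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the provider's chat-completions endpoint with an API key.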