aixonlab/Eurydice-24b-v3
Task: Text generation
Model size: 24B
Quantization: FP8
Context length: 32k
License: apache-2.0
Architecture: Transformer
Concurrency cost: 2
Availability: Open weights, warm

Eurydice 24b v3 by Aixon Lab is a 24-billion-parameter causal language model built on Mistral 3.1 and designed for multi-role conversations. It demonstrates strong contextual understanding, with particular strengths in creativity, natural conversation, and storytelling. Trained on a custom dataset, it is intended for a range of natural language processing tasks, especially chat, question answering, and analysis.


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model adjust the following sampler settings:

temperature — scales the randomness of the output distribution
top_p — nucleus sampling; keeps the smallest token set whose cumulative probability exceeds p
top_k — restricts sampling to the k most likely tokens
frequency_penalty — penalizes tokens in proportion to how often they have already appeared
presence_penalty — penalizes tokens that have appeared at all
repetition_penalty — multiplicative penalty on previously generated tokens
min_p — discards tokens below a minimum probability relative to the most likely token
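As a rough sketch of how these settings are applied in practice, the snippet below assembles an OpenAI-style chat completion request body that carries the sampler parameters alongside the messages. The endpoint URL, the field placement for non-standard parameters (top_k, min_p, repetition_penalty), and all numeric values here are illustrative assumptions, not a recommended configuration — consult the Featherless API documentation for the exact accepted fields.

```python
import json

# Assumed OpenAI-compatible endpoint; verify against the provider's docs.
API_URL = "https://api.featherless.ai/v1/chat/completions"

def build_payload(prompt: str, **samplers) -> dict:
    """Assemble an OpenAI-style chat completion request body.

    Sampler settings (temperature, top_p, top_k, etc.) are passed as
    keyword arguments and merged into the top level of the JSON body.
    """
    payload = {
        "model": "aixonlab/Eurydice-24b-v3",
        "messages": [{"role": "user", "content": prompt}],
    }
    payload.update(samplers)
    return payload

payload = build_payload(
    "Tell me a short story.",
    temperature=0.8,          # illustrative values only
    top_p=0.95,
    top_k=40,
    min_p=0.05,
    repetition_penalty=1.05,
    frequency_penalty=0.0,
    presence_penalty=0.0,
)
print(json.dumps(payload, indent=2))
```

The request would then be POSTed to the endpoint with an API key in the Authorization header; some client libraries require the non-standard fields to be passed through an escape hatch (for example, `extra_body` in the OpenAI Python SDK) rather than as top-level arguments.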