Xiaojian9992024/Qwen2.5-Dyanka-7B-Preview
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quant: FP8 · Context length: 32k · Published: Feb 25, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights

Qwen2.5-Dyanka-7B-Preview is a 7.6 billion parameter language model created by Xiaojian9992024 through a TIES merge of several Qwen2.5-7B-based models, including Rombos-LLM-V2.5-Qwen-7b and Clarus-7B-v0.1. This model leverages the Qwen2.5 architecture and is designed to combine the strengths of its constituent models. It is suitable for general language tasks, with its performance evaluated on the Open LLM Leaderboard.
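TIES merges of this kind are typically produced with the mergekit tool. The sketch below shows what such a merge config could look like; the base-model choice, the full repository paths, and the density/weight values are illustrative assumptions, not the actual recipe used for this model.

```yaml
# Hypothetical mergekit config for a TIES merge of Qwen2.5-7B-based models.
# Repository paths, density, and weight values are illustrative assumptions.
merge_method: ties
base_model: Qwen/Qwen2.5-7B
models:
  - model: Rombos-LLM-V2.5-Qwen-7b   # constituent model named in the card
    parameters:
      density: 0.5
      weight: 0.5
  - model: Clarus-7B-v0.1            # constituent model named in the card
    parameters:
      density: 0.5
      weight: 0.5
dtype: bfloat16
```

In a TIES merge, each model's parameter deltas from the base are trimmed to the densest fraction (`density`), sign conflicts across models are resolved by majority, and the surviving deltas are averaged with the given weights.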


Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model. Each configuration sets the following sampler parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
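These settings map onto an OpenAI-compatible completion request, which is how most inference servers (including Featherless) accept them; `top_k`, `repetition_penalty`, and `min_p` are common extensions beyond the core OpenAI parameters. The sketch below assembles such a request body; the specific values are illustrative assumptions, not a recommended configuration for this model.

```python
# Sketch: building an OpenAI-compatible chat completion payload using the
# sampler parameters listed above. All values are illustrative assumptions.

def build_payload(prompt: str) -> dict:
    """Assemble a request body for an OpenAI-compatible endpoint."""
    return {
        "model": "Xiaojian9992024/Qwen2.5-Dyanka-7B-Preview",
        "messages": [{"role": "user", "content": prompt}],
        # Sampler settings (illustrative values):
        "temperature": 0.7,        # randomness of token selection
        "top_p": 0.9,              # nucleus sampling probability cutoff
        "top_k": 40,               # consider only the k most likely tokens
        "frequency_penalty": 0.0,  # penalize tokens by repeat count
        "presence_penalty": 0.0,   # penalize tokens that appeared at all
        "repetition_penalty": 1.1, # multiplicative penalty on repeated tokens
        "min_p": 0.05,             # drop tokens below this fraction of top prob
    }

payload = build_payload("Hello!")
print(len(payload))
```

The payload would then be POSTed as JSON to the server's `/v1/chat/completions` endpoint.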