KaraKaraWitch/EurobeatVARemix-Qwen2.5-72b
Text generation · Model size: 72.7B · Quant: FP8 · Context length: 32K · Architecture: Transformer · Concurrency cost: 4

KaraKaraWitch/EurobeatVARemix-Qwen2.5-72b is a 72.7 billion parameter merged language model built upon the Qwen2.5-72B architecture, created by KaraKaraWitch. This model integrates multiple specialized Qwen2.5-72B variants using the Model Stock merge method, aiming to combine their strengths. It supports a wide array of languages including Chinese, English, French, Spanish, and more, making it suitable for diverse multilingual applications.


EurobeatVARemix-Qwen2.5-72b Overview

EurobeatVARemix-Qwen2.5-72b is a 72.7 billion parameter language model developed by KaraKaraWitch. It is a merge of several Qwen2.5-72B-based models, utilizing the Model Stock merge method with Qwen/Qwen2.5-72B as its base. This approach aims to consolidate the capabilities of various fine-tuned models into a single, more versatile offering.

Key Merge Details

The model incorporates contributions from:

  • ZeusLabs/Chronos-Platinum-72B
  • EVA-UNIT-01/EVA-Qwen2.5-72B-v0.1
  • m8than/banana-2-b-72b
  • abacusai/Dracarys2-72B-Instruct
  • rombodawg/Rombos-LLM-V2.5-Qwen-72b

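The merge recipe above maps directly onto a mergekit configuration. The sketch below is a hypothetical reconstruction, assuming the standard mergekit `model_stock` method: the method, base model, and component models come from this card, but the dtype and any other settings are assumptions, not the author's published config.

```yaml
# Hypothetical mergekit config for this merge (method and model list
# from the model card; dtype is an assumption).
merge_method: model_stock
base_model: Qwen/Qwen2.5-72B
models:
  - model: ZeusLabs/Chronos-Platinum-72B
  - model: EVA-UNIT-01/EVA-Qwen2.5-72B-v0.1
  - model: m8than/banana-2-b-72b
  - model: abacusai/Dracarys2-72B-Instruct
  - model: rombodawg/Rombos-LLM-V2.5-Qwen-72b
dtype: bfloat16
```

Model Stock averages the component models' weights around the shared base, which is why a single common ancestor (Qwen/Qwen2.5-72B) is required.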
This diverse set of merged models suggests an intent to create a robust general-purpose model with enhanced performance across various tasks. The model is designed to be compatible with ChatML prompt formatting.
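ChatML wraps each conversation turn in `<|im_start|>` / `<|im_end|>` tokens, with the role name on the first line. A minimal sketch of rendering messages into this format (the helper name and example messages are illustrative, not from the model card):

```python
# Minimal ChatML renderer. The function name and sample messages are
# illustrative; only the <|im_start|>/<|im_end|> template itself is
# the format this model expects.
def to_chatml(messages):
    """Render a list of {'role', 'content'} dicts as a ChatML prompt."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # A trailing assistant header cues the model to begin its reply.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = to_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize Model Stock merging in one sentence."},
])
print(prompt)
```

In practice, a chat template shipped with the tokenizer (e.g. via `tokenizer.apply_chat_template` in `transformers`) would produce the same layout.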

Multilingual Support

EurobeatVARemix-Qwen2.5-72b supports a broad range of languages, including but not limited to Chinese, English, French, Spanish, Portuguese, German, Italian, Russian, Japanese, Korean, Vietnamese, Thai, and Arabic, indicating its utility in multilingual contexts.

Popular Sampler Settings

Featherless surfaces the three most popular sampler configurations used for this model. Each configuration covers the following parameters:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p
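These parameters are typically passed in the body of an OpenAI-style chat-completion request. The sketch below shows where each one goes; the numeric values are placeholders, not the popular Featherless configurations (those are not reproduced here). Note that `top_k`, `min_p`, and `repetition_penalty` are common extensions to the OpenAI schema, and support for them varies by inference server.

```python
# Sketch of a chat-completion request payload carrying these sampler
# settings. All numeric values are placeholders, NOT the popular configs
# from the Featherless widget. top_k, min_p, and repetition_penalty are
# server-specific extensions to the OpenAI-style schema.
payload = {
    "model": "KaraKaraWitch/EurobeatVARemix-Qwen2.5-72b",
    "messages": [{"role": "user", "content": "Hello!"}],
    "temperature": 0.8,          # placeholder value
    "top_p": 0.95,               # placeholder value
    "top_k": 40,                 # placeholder value
    "frequency_penalty": 0.0,    # placeholder value
    "presence_penalty": 0.0,     # placeholder value
    "repetition_penalty": 1.05,  # placeholder value
    "min_p": 0.05,               # placeholder value
}

# This dict would be POSTed as JSON to an OpenAI-compatible endpoint,
# e.g. requests.post(url, json=payload, headers=...) (URL omitted here).
```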