gghfez/Magnum-v1-72b-Qwen2.5
TEXT GENERATION

Concurrency Cost: 4 · Model Size: 72.7B · Quant: FP8 · Ctx Length: 32k · Published: Oct 2, 2024 · License: other · Architecture: Transformer

gghfez/Magnum-v1-72b-Qwen2.5 is a 72 billion parameter language model based on the Qwen2.5 architecture, created by merging the creative capabilities of anthracite-org/magnum-v1-72b with the updated Qwen/Qwen2.5-72B-Instruct. This model is designed to offer enhanced creativity while retaining the general improvements of Qwen2.5, including improved coding ability and awareness of recent world events. It is particularly suited for tasks requiring creative text generation and general instruction following.


Model Overview

gghfez/Magnum-v1-72b-Qwen2.5 is a 72 billion parameter language model that combines the strengths of two distinct models: the creative output of anthracite-org/magnum-v1-72b and the updated capabilities of Qwen/Qwen2.5-72B-Instruct. The model was created by extracting a LoRA from anthracite-org/magnum-v1-72b (which was originally based on Qwen2-72B-Instruct) and applying it to the newer Qwen/Qwen2.5-72B-Instruct base model.

Key Capabilities

  • Enhanced Creativity: Retains the "creative" output style of the original Magnum-v1 model.
  • Improved General Performance: Benefits from the advancements of the Qwen2.5 base, including better zero-shot coding abilities (e.g., generating a Python Snake game).
  • Updated World Knowledge: Demonstrates awareness of world events that occurred after the release of Qwen2, indicating more current training data or fine-tuning.
  • Instruction Following: Designed to follow instructions effectively, leveraging its Qwen2.5-Instruct foundation.

Good For

  • Creative Content Generation: Ideal for applications requiring imaginative and diverse text outputs.
  • General Purpose AI Tasks: Suitable for a broad range of instruction-following and conversational AI scenarios.
  • Code Generation: Capable of zero-shot code generation, making it useful for developer-centric applications.

Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model involve the following sampler parameters:

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p
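As a sketch of how these parameters fit together, the request body below targets an OpenAI-compatible chat completions endpoint. The numeric values are illustrative placeholders, not the community-reported settings from the page's interactive tabs:

```python
# Illustrative request body for an OpenAI-compatible completions API.
# Values are example defaults, not the configs reported by users.
payload = {
    "model": "gghfez/Magnum-v1-72b-Qwen2.5",
    "messages": [
        {"role": "user", "content": "Write a short story about a lighthouse."}
    ],
    "temperature": 0.9,          # higher values increase output variance
    "top_p": 0.95,               # nucleus sampling probability cutoff
    "top_k": 40,                 # sample only from the 40 most likely tokens
    "frequency_penalty": 0.0,    # penalize tokens by how often they recur
    "presence_penalty": 0.0,     # penalize tokens that have appeared at all
    "repetition_penalty": 1.05,  # multiplicative penalty on repeated tokens
    "min_p": 0.05,               # drop tokens below 5% of the top token's probability
}
```

Creative-writing use cases typically favor a higher temperature with min_p or repetition_penalty as a guardrail, while code generation usually benefits from lower temperature values.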