natezahedi/magnum-v2-32b

Visibility: Public
Parameters: 32.5B
Quantization: FP8
Context length: 32768 tokens
Released: Jan 21, 2026
License: tongyi-qianwen
Source: Hugging Face
Overview

Magnum-v2-32b: Claude 3 Prose Replication

Magnum-v2-32b is a 32.5-billion-parameter language model published by natezahedi and built on the Qwen1.5 32B architecture. Its primary objective is to replicate the prose quality of Anthropic's Claude 3 Sonnet and Opus models.

Key Capabilities

  • Advanced Prose Generation: Designed to produce text with a quality comparable to Claude 3 models, focusing on nuance and sophistication.
  • Large Context Window: Supports a context length of 32768 tokens, enabling the processing and generation of extensive and complex narratives or conversations.
  • Instruction-Tuned: Fine-tuned on ChatML-formatted instruction data, so prompts should follow the ChatML chat template.
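Because the model expects ChatML-formatted input, prompts are built from `<|im_start|>`/`<|im_end|>`-delimited turns. A minimal sketch of that formatting; the helper name and the example system/user text are illustrative, not taken from the model card:

```python
# Sketch: rendering a conversation into the ChatML template that
# magnum-v2-32b was instruction-tuned on. The message contents here
# are illustrative placeholders.

def build_chatml_prompt(messages):
    """Render a list of {"role", "content"} dicts into a ChatML string,
    ending with an open assistant turn for the model to complete."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Leave the assistant turn open so generation continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful writing assistant."},
    {"role": "user", "content": "Write a short scene set in a lighthouse."},
])
print(prompt)
```

Any serving stack that supports a custom chat template (or the tokenizer's built-in one) can produce the same layout; the point is that each turn is wrapped in `<|im_start|>{role} … <|im_end|>` and the prompt ends with an open assistant turn.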

Good For

  • Creative Writing: Ideal for generating high-quality stories, articles, and other creative content that requires refined language.
  • Sophisticated Conversational AI: Suitable for chatbots and virtual assistants where nuanced and human-like dialogue is crucial.
  • Content Generation: Excellent for tasks requiring detailed and well-structured textual output across various domains.

Training Details

The model underwent full-parameter fine-tuning for 2 epochs using 8 NVIDIA H100 Tensor Core GPUs. The training leveraged a diverse set of filtered datasets, including Stheno, NobodyExistsOnTheInternet/claude_3.5s_single_turn_unslop_filtered, NobodyExistsOnTheInternet/PhiloGlanSharegpt, NobodyExistsOnTheInternet/Magpie-Reasoning-Medium-Subset, kalomaze/Opus_Instruct_25k, Nopm/Opus_WritingStruct, and a subset of Gryphe/Sonnet3.5-SlimOrcaDedupCleaned.