Delta-Vector/Hamanasu-Magnum-QwQ-32B
TEXT GENERATION
Concurrency Cost: 2
Model Size: 32.8B
Quant: FP8
Ctx Length: 32k
Published: Mar 13, 2025
Architecture: Transformer

Delta-Vector/Hamanasu-Magnum-QwQ-32B is a 32.8-billion-parameter language model fine-tuned from Delta-Vector/Hamanasu-QwQ-V2-RP. It is optimized to replicate the prose style of Anthropic's Claude models (Opus and Sonnet), making it well suited to traditional roleplay scenarios. The model was trained for two epochs on 8x H100 GPUs and natively supports a context length of 131,072 tokens (served here with a 32k context window).
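As a sketch of how the model might be queried, the snippet below assumes Featherless exposes an OpenAI-compatible chat completions endpoint; the base URL and environment variable name are assumptions, not taken from this page.

```python
import os

from openai import OpenAI

# Assumed OpenAI-compatible endpoint; check the Featherless docs for the
# actual base URL and authentication scheme.
client = OpenAI(
    base_url="https://api.featherless.ai/v1",   # assumption
    api_key=os.environ["FEATHERLESS_API_KEY"],  # hypothetical env var name
)

response = client.chat.completions.create(
    model="Delta-Vector/Hamanasu-Magnum-QwQ-32B",
    messages=[
        {"role": "system", "content": "You are a roleplay partner."},
        {"role": "user", "content": "Describe the tavern as I walk in."},
    ],
    max_tokens=512,
)
print(response.choices[0].message.content)
```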


Popular Sampler Settings

The three most popular sampler configurations among Featherless users for this model. Each config specifies the following parameters:

temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
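This page capture lists the parameter names but not the values of the top-3 configs, so the sketch below uses illustrative placeholder values only. It reuses the assumed OpenAI-compatible client from the earlier sketch; non-standard samplers such as top_k, repetition_penalty, and min_p are passed through extra_body, which many OpenAI-compatible servers accept.

```python
import os

from openai import OpenAI

# Same assumed endpoint as in the earlier sketch.
client = OpenAI(
    base_url="https://api.featherless.ai/v1",   # assumption
    api_key=os.environ["FEATHERLESS_API_KEY"],  # hypothetical env var name
)

# Placeholder sampler values for illustration; substitute a real config
# from the Popular Sampler Settings tabs on the model page.
sampler_config = {
    "temperature": 0.8,
    "top_p": 0.95,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
}

# Parameters outside the standard OpenAI schema; many OpenAI-compatible
# servers read these as extra fields in the request body.
extra_samplers = {
    "top_k": 40,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
}

response = client.chat.completions.create(
    model="Delta-Vector/Hamanasu-Magnum-QwQ-32B",
    messages=[{"role": "user", "content": "Continue the scene."}],
    extra_body=extra_samplers,
    **sampler_config,
)
print(response.choices[0].message.content)
```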