Infermatic/magnum-v4-72b-FP8-Dynamic
Text Generation · Concurrency Cost: 4 · Model Size: 72.7B · Quant: FP8 · Ctx Length: 32k · Published: Oct 21, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Infermatic/magnum-v4-72b-FP8-Dynamic is a 72.7 billion parameter language model, dynamically quantized to FP8, based on anthracite-org's magnum-v4-72b. That model is fine-tuned from Qwen2.5-72B-Instruct with the goal of replicating the prose quality of the Claude 3 models (Sonnet and Opus). It is optimized for generating high-quality, nuanced text, making it suitable for advanced conversational AI and creative writing applications.


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model are built from the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
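As a rough sketch, the settings above map onto the request body of an OpenAI-compatible completions call. The parameter values below are illustrative placeholders, not a published preset for this model, and the exact set of sampler fields a given server accepts may vary:

```python
import json

# Hypothetical request payload for an OpenAI-compatible endpoint.
# All numeric values are placeholder assumptions for illustration.
payload = {
    "model": "Infermatic/magnum-v4-72b-FP8-Dynamic",
    "prompt": "Write a short scene set in a lighthouse.",
    # Sampler settings from the list above (values are placeholders):
    "temperature": 1.0,
    "top_p": 0.95,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
}

# Serialize into the JSON body the server would receive.
body = json.dumps(payload)
print(body)
```

In practice you would POST this body to the provider's completions endpoint with your API key; only the sampler keys shown correspond to the settings listed above.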