anthracite-org/magnum-v4-12b
Text generation · Open weights
- Concurrency cost: 1
- Model size: 12B
- Quantization: FP8
- Context length: 32k
- Published: Oct 20, 2024
- License: apache-2.0
- Architecture: Transformer
anthracite-org/magnum-v4-12b is a 12-billion-parameter causal language model fine-tuned by anthracite-org from mistralai/Mistral-Nemo-Instruct-2407, with a 32,768-token context length. The model is designed to replicate the prose quality of the Claude 3 Sonnet and Opus models. It is optimized for generating high-quality, nuanced text, making it well suited to creative writing and conversational AI applications.
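Like other magnum fine-tunes, this model is typically prompted with ChatML-style turns. The helper below is an illustrative sketch, not part of any official API, and the ChatML assumption is worth verifying against the model's tokenizer configuration:

```python
def chatml_prompt(system, turns):
    """Build a ChatML prompt string; `turns` is a list of (role, text) pairs.

    Illustrative only -- assumes the model expects ChatML formatting.
    """
    parts = [f"<|im_start|>system\n{system}<|im_end|>"]
    for role, text in turns:
        parts.append(f"<|im_start|>{role}\n{text}<|im_end|>")
    # Open an assistant turn so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = chatml_prompt(
    "You are a novelist with a vivid, restrained style.",
    [("user", "Write the opening line of a mystery set in a lighthouse.")],
)
print(prompt)
```

In practice, `AutoTokenizer.apply_chat_template` on the model's own tokenizer is the safer route, since it reads the template shipped with the weights.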
Popular Sampler Settings
The three parameter combinations most used by Featherless users for this model tune the following samplers:
- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
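To make the list above concrete, here is a minimal sketch of how temperature, top_k, top_p, and min_p filter a next-token distribution. It is pure Python with illustrative logits and settings, not a real Featherless configuration, and it omits the three penalty parameters, which act on logits based on the generation history rather than on a single distribution:

```python
import math

def apply_samplers(logits, temperature=1.0, top_k=0, top_p=1.0, min_p=0.0):
    """Return a filtered, renormalised distribution over token ids."""
    # Temperature rescales logits before the softmax: <1 sharpens, >1 flattens.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = {i: e / total for i, e in enumerate(exps)}

    # top_k: keep only the k most probable tokens (0 disables).
    if top_k > 0:
        kept = sorted(probs, key=probs.get, reverse=True)[:top_k]
        probs = {i: probs[i] for i in kept}

    # top_p (nucleus): keep the smallest set whose cumulative mass reaches top_p.
    if top_p < 1.0:
        cum, kept = 0.0, []
        for i in sorted(probs, key=probs.get, reverse=True):
            kept.append(i)
            cum += probs[i]
            if cum >= top_p:
                break
        probs = {i: probs[i] for i in kept}

    # min_p: drop tokens below min_p times the top token's probability.
    if min_p > 0.0:
        ceiling = max(probs.values())
        probs = {i: p for i, p in probs.items() if p >= min_p * ceiling}

    # Renormalise the survivors so the result is a valid distribution.
    total = sum(probs.values())
    return {i: p / total for i, p in probs.items()}

dist = apply_samplers([2.0, 1.0, 0.2, -1.0], temperature=0.8, top_k=3, min_p=0.1)
print(sorted(dist))  # → [0, 1, 2]
```

Creative-writing presets for models like this one often pair a moderate temperature with min_p filtering and a light repetition penalty, but the exact values in each popular config are only visible in the interactive tabs on the page.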