Trappu/Magnum-Picaro-0.7-v2-12b
Text Generation · Concurrency Cost: 1 · Model Size: 12B · Quant: FP8 · Ctx Length: 32k · Published: Sep 12, 2024 · License: apache-2.0 · Architecture: Transformer

Trappu/Magnum-Picaro-0.7-v2-12b is a 12 billion parameter merged language model, combining Trappu/Nemo-Picaro-12B and anthracite-org/magnum-v2-12b, with a 32768 token context length. Developed by Trappu, this model is specifically designed to enhance creative writing, particularly for storytelling, scenario prompting, and roleplay, by stabilizing the specialized narrative capabilities of Picaro with Magnum's broader versatility. It aims to provide a balanced performance for both focused narrative generation and general creative text generation.


Popular Sampler Settings

The three parameter combinations most used by Featherless users for this model. Each config tunes the following sampler parameters:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
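As a minimal sketch of how these sampler parameters are typically supplied, the snippet below assembles an OpenAI-style chat completion payload for this model. The endpoint, helper name, and every sampler value are illustrative assumptions, not the actual "popular" configs from the tabs above.

```python
# Hypothetical sketch: building an OpenAI-compatible request payload that
# exposes the sampler knobs listed above. All numeric values are
# placeholders, not recommendations from this model card.

def build_request(prompt: str) -> dict:
    """Assemble a chat completion payload with common sampler settings."""
    return {
        "model": "Trappu/Magnum-Picaro-0.7-v2-12b",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.8,          # randomness of token selection
        "top_p": 0.95,               # nucleus sampling probability cutoff
        "top_k": 40,                 # keep only the k most likely tokens
        "frequency_penalty": 0.0,    # penalize tokens by how often they appear
        "presence_penalty": 0.0,     # penalize tokens already present at all
        "repetition_penalty": 1.05,  # multiplicative anti-repetition factor
        "min_p": 0.05,               # drop tokens below this relative probability
    }

payload = build_request("Write the opening scene of a heist story.")
# The payload would then be POSTed to an OpenAI-compatible endpoint.
print(sorted(k for k in payload if k not in ("model", "messages")))
```

The payload keys mirror the parameter list above, so any of the three popular configs can be dropped in by replacing the placeholder values.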