nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2
Text generation · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Oct 4, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2 is a 12-billion-parameter language model built on the Mistral-Nemo architecture. It was fine-tuned by nbeerbower with ORPO on the Gutenberg-DPO and gutenberg2-dpo preference datasets, which are built from public-domain literature. This literary preference tuning orients the model toward long-form prose, making it well suited to nuanced text generation and comprehension tasks.
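For local use, a minimal sketch with Hugging Face transformers might look like the following. The repo id is taken from this page; the dtype, device placement, prompt, and generation settings are illustrative assumptions, not recommendations from the model author.

```python
# Minimal sketch of running the model locally with Hugging Face transformers.
# Assumes the standard transformers/torch stack and that the tokenizer ships a
# chat template; dtype, device, and prompt below are illustrative choices.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed dtype; 12B weights in bf16 need a large GPU
    device_map="auto",
)

# Build a chat-style prompt with the tokenizer's chat template.
messages = [{"role": "user", "content": "Write the opening paragraph of a gothic short story."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8, top_p=0.95)
# Decode only the newly generated tokens.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```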
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model; the parameters each config covers are listed below.
Sampler parameters covered by these configs: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p (entries shown as "–" are left unset).
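A minimal sketch of passing these sampler parameters through an OpenAI-compatible client is shown below. The endpoint URL, the environment-variable name, and every numeric value are illustrative assumptions, not the user configurations reported above.

```python
# Sketch of applying sampler settings via an OpenAI-compatible endpoint.
# base_url and the env var name are assumptions; all numeric values are placeholders.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",   # assumed OpenAI-compatible endpoint
    api_key=os.environ["FEATHERLESS_API_KEY"],  # hypothetical env var name
)

response = client.chat.completions.create(
    model="nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2",
    messages=[{"role": "user", "content": "Continue this scene in the style of a Victorian novel."}],
    # Standard OpenAI sampler fields; values here are placeholders.
    temperature=0.8,
    top_p=0.95,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    max_tokens=512,
    # Non-standard samplers (top_k, min_p, repetition_penalty) are not part of the
    # OpenAI schema; if the backend accepts them, they can be passed via extra_body.
    extra_body={"top_k": 40, "min_p": 0.05, "repetition_penalty": 1.05},
)
print(response.choices[0].message.content)
```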