nbeerbower/Mistral-Nemo-Gutenberg-Encore-12B
Text generation | Concurrency cost: 1 | Model size: 12B | Quant: FP8 | Context length: 32k | Published: Jun 4, 2025 | License: apache-2.0 | Architecture: Transformer | Open weights

nbeerbower/Mistral-Nemo-Gutenberg-Encore-12B is a 12-billion-parameter causal language model fine-tuned by nbeerbower from mistralai/Mistral-Nemo-Instruct-2407. It specializes in creative writing and narrative generation, particularly fiction, via ORPO fine-tuning on several DPO-format datasets focused on literary content. Compared to its base model, it offers richer prose style and greater thematic depth, making it well suited to generating imaginative, coherent stories.
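A minimal sketch of generating a story with this model via the Hugging Face `transformers` text-generation pipeline. This assumes the fine-tune keeps the base Mistral instruct chat format (`[INST] … [/INST]`) and that `transformers` and a suitable backend (e.g. PyTorch) are installed; the sampling parameters are illustrative, not recommendations from the model card.

```python
def build_prompt(instruction: str) -> str:
    # Mistral instruct format: wrap the user turn in [INST] tags.
    # Assumption: the fine-tune inherits the base model's chat template.
    return f"[INST] {instruction} [/INST]"

def generate_story(instruction: str,
                   model_id: str = "nbeerbower/Mistral-Nemo-Gutenberg-Encore-12B",
                   max_new_tokens: int = 512) -> str:
    # Imported lazily so the module can load without transformers installed.
    from transformers import pipeline

    generator = pipeline("text-generation", model=model_id, device_map="auto")
    result = generator(
        build_prompt(instruction),
        max_new_tokens=max_new_tokens,
        do_sample=True,       # sampling suits creative writing better than greedy decoding
        temperature=0.8,
        return_full_text=False,  # drop the prompt, keep only the generated continuation
    )
    return result[0]["generated_text"]
```

Note that a 12B FP8 model needs roughly 13 GB of accelerator memory for the weights alone, so `device_map="auto"` may spill layers to CPU on smaller GPUs.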
