nbeerbower/Qwen2.5-Gutenberg-Doppel-14B
Text Generation
Concurrency Cost: 1 | Model Size: 14.8B | Quant: FP8 | Ctx Length: 32k | Published: Nov 11, 2024 | License: apache-2.0 | Architecture: Transformer

nbeerbower/Qwen2.5-Gutenberg-Doppel-14B is a 14.8 billion parameter model built on Qwen2.5-14B-Instruct and fine-tuned by nbeerbower with ORPO on Gutenberg-derived datasets. The model specializes in text generation and shows strong results on instruction following (IFEval) and general reasoning (BBH) benchmarks. It is suited to applications that need robust language understanding and generation across multiple languages, including English, Chinese, and French.
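As an instruct-tuned Qwen2.5 derivative, the model expects prompts in ChatML format (`<|im_start|>role ... <|im_end|>` blocks). The sketch below shows that prompt layout and a hypothetical `generate` helper using the Hugging Face `transformers` API; the function name and generation parameters are illustrative, not from the model card, and loading the full 14.8B weights requires appropriate hardware.

```python
def build_chat_prompt(messages):
    """Format a list of {'role', 'content'} dicts in the ChatML layout
    used by Qwen2.5-family instruct models."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    # Trailing assistant header cues the model to begin its reply.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)


def generate(messages, max_new_tokens=256):
    """Illustrative helper: load the model via transformers and generate.
    Downloads ~15B parameters on first use; shown here as a sketch."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "nbeerbower/Qwen2.5-Gutenberg-Doppel-14B"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(build_chat_prompt(messages), return_tensors="pt")
    inputs = inputs.to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

In practice `tokenizer.apply_chat_template` can build the same prompt from the tokenizer's bundled template; the manual version above just makes the ChatML structure explicit.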
