qingy2024/SynGen-14B
- Task: text generation
- Model size: 14B parameters
- Quantization: FP8
- Context length: 32K
- Concurrency cost: 1
- Published: Jan 1, 2026
- License: apache-2.0
- Architecture: Transformer (open weights)

SynGen-14B by qingy2024 is a 14-billion-parameter large language model based on Qwen3-14B, designed for synthetic grounded-reasoning generation. It specializes in transforming chat datasets into reasoning datasets, emulating reasoning styles such as DeepSeek R1 or OpenAI's GPT OSS. With a 32K context length, the model is suited to tasks that require explicit reasoning between the user prompt and the final output, particularly dataset modification and generation.
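As a minimal sketch of the dataset-transformation workflow described above, the helper below builds a chat-completions request payload for one chat-dataset example. The system instruction, the `wrap_chat_example` helper, and the payload shape (an OpenAI-compatible endpoint is assumed) are illustrative assumptions, not part of this model card; SynGen-14B's actual prompt format may differ.

```python
# Sketch: turning one chat-dataset example into a request payload for a
# hypothetical OpenAI-compatible server hosting SynGen-14B.
# The system instruction below is an assumption; the card does not specify one.
SYSTEM_INSTRUCTION = (
    "Rewrite the assistant reply so it contains explicit step-by-step "
    "reasoning before the final answer."
)

def wrap_chat_example(turns, model="qingy2024/SynGen-14B", max_tokens=2048):
    """Build a chat-completions payload from a list of (role, content) turns.

    `turns` is the original chat-dataset example, e.g.
    [("user", "..."), ("assistant", "...")].
    """
    messages = [{"role": "system", "content": SYSTEM_INSTRUCTION}]
    messages += [{"role": role, "content": content} for role, content in turns]
    return {"model": model, "messages": messages, "max_tokens": max_tokens}

# One chat pair from a source dataset, ready to send for reasoning rewriting.
payload = wrap_chat_example([("user", "What is 2 + 2?"), ("assistant", "4")])
```

The returned dict can then be POSTed to whatever inference server is hosting the model; each generated response becomes one reasoning-augmented example in the new dataset.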
