THU-KEG/LongWriter-Zero-32B
Text generation · Concurrency cost: 2 · Model size: 32.8B · Quant: FP8 · Context length: 32k · Published: Jun 18, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights
LongWriter-Zero-32B is a 32 billion parameter large language model developed by THU-KEG, built on Qwen 2.5-32B-Base. It is designed for ultra-long text generation and can produce coherent passages exceeding 10,000 tokens. The model combines continual pretraining on 30 billion tokens of long-form content with reinforcement learning under a composite reward function that targets fluency, coherence, and length control. In this specialized domain of extended, structured text generation it matches or surpasses 100B-scale models.
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model. Parameters covered: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p (values: –).
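As a minimal sketch of how sampler settings like these are typically passed to an OpenAI-compatible inference endpoint: the request body below uses illustrative values (the page does not list the actual configs), and the base URL plus the pass-through of non-standard fields such as top_k, repetition_penalty, and min_p are assumptions about the serving stack, not documented behavior.

```python
# Sketch: a chat-completions request body carrying the sampler parameters
# named above. Values are illustrative placeholders, not the real configs.
payload = {
    "model": "THU-KEG/LongWriter-Zero-32B",
    "messages": [
        {"role": "user", "content": "Write a detailed 10,000-token report."}
    ],
    # Standard OpenAI-style sampler parameters:
    "temperature": 0.7,
    "top_p": 0.9,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    # Non-standard parameters many open-weights servers also accept
    # (whether this endpoint does is an assumption):
    "top_k": 40,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
}

# With the openai Python client, the non-standard fields would go
# through extra_body rather than named keyword arguments, e.g.:
#   resp = client.chat.completions.create(
#       model=payload["model"], messages=payload["messages"],
#       temperature=payload["temperature"], top_p=payload["top_p"],
#       extra_body={"top_k": 40, "repetition_penalty": 1.05, "min_p": 0.05},
#   )

sampler_keys = sorted(k for k in payload if k not in ("model", "messages"))
print(sampler_keys)
```

Keeping the non-standard fields in `extra_body` (or the raw JSON body) matters because strict OpenAI clients reject unknown keyword arguments while most open-weights servers silently honor extra body fields.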