zai-org/GLM-4-32B-0414
TEXT GENERATION
Concurrency Cost: 2
Model Size: 32B
Quant: FP8
Ctx Length: 32k
Published: Apr 7, 2025
License: MIT
Architecture: Transformer
0.5K · Open Weights · Cold

GLM-4-32B-0414 is a 32-billion-parameter model from the GLM family, pre-trained on 15T tokens of high-quality data, including synthetic reasoning data. It excels at instruction following, engineering code generation, function calling, and agent tasks, with performance comparable to larger models such as GPT-4o and DeepSeek-V3-0324 on specific benchmarks. The model is particularly optimized for complex reasoning and code-related applications, and it supports user-friendly local deployment.


Popular Sampler Settings

The three parameter combinations most commonly used by Featherless users for this model. Each configuration can include the following sampler parameters:

temperature — scales the output distribution; higher values produce more random completions
top_p — nucleus sampling: sample only from the smallest token set whose cumulative probability exceeds p
top_k — restrict sampling to the k most likely tokens
frequency_penalty — penalizes tokens in proportion to how often they have already appeared
presence_penalty — penalizes tokens that have appeared at all, encouraging new topics
repetition_penalty — multiplicative penalty applied to the logits of previously generated tokens
min_p — discards tokens whose probability falls below a fraction of the most likely token's probability
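As a minimal sketch of how such a sampler configuration is typically used, the snippet below merges the parameters into a request payload for an OpenAI-compatible completions endpoint. The specific values, the `max_tokens` setting, and the prompt are illustrative assumptions, not the actual popular settings shown on this page.

```python
import json

# Assumed example values -- NOT the actual popular settings for this model.
SAMPLER_CONFIG = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

def build_request(prompt: str, sampler: dict) -> dict:
    """Merge a sampler config into a completion request payload."""
    payload = {
        "model": "zai-org/GLM-4-32B-0414",
        "prompt": prompt,
        "max_tokens": 256,  # illustrative limit
    }
    payload.update(sampler)
    return payload

request = build_request("Write a haiku about code.", SAMPLER_CONFIG)
print(json.dumps(request, indent=2))
```

The same payload shape works with any OpenAI-compatible serving stack; only the base URL and API key differ between providers.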