zai-org/GLM-4-32B-Base-0414
Text generation · Concurrency cost: 2 · Model size: 32B · Quantization: FP8 · Context length: 32k · Published: Apr 7, 2025 · License: MIT · Architecture: Transformer · Open weights

GLM-4-32B-Base-0414 is a 32 billion parameter base model from the GLM-4 series, developed by zai-org. Pre-trained on 15T tokens of high-quality data, including a substantial amount of reasoning-oriented synthetic data, it supports a 32,768-token context length. The model excels at engineering code, artifact generation, function calling, search-based Q&A, and report generation, offering performance comparable to larger models such as GPT-4o and DeepSeek-V3-0324 on specific benchmarks.
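The sketch below shows one way to query this model through an OpenAI-compatible text-completions endpoint. The base URL and the environment-variable name are assumptions for illustration, not details confirmed by this page; since this is a base (non-instruct) model, the plain completions endpoint is used rather than chat.

```python
# Minimal sketch of querying GLM-4-32B-Base-0414 via an OpenAI-compatible API.
# The base_url and FEATHERLESS_API_KEY env var are assumptions, not confirmed here.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",   # assumed endpoint
    api_key=os.environ["FEATHERLESS_API_KEY"],  # assumed env var name
)

# Base models are plain text-completion models, so use the completions
# endpoint with a raw prompt rather than a chat message list.
response = client.completions.create(
    model="zai-org/GLM-4-32B-Base-0414",
    prompt="def fibonacci(n):",
    max_tokens=128,
    temperature=0.7,
)
print(response.choices[0].text)
```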


Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model. Each configuration sets the sampler parameters listed below; a sketch of how such a configuration maps onto an API request follows the list.

temperature · top_p · top_k · frequency_penalty · presence_penalty · repetition_penalty · min_p
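Below is a hedged sketch of how one of these sampler configurations might be applied to a request through an OpenAI-compatible client. The values shown are placeholders, not the actual popular configurations, and the endpoint details are the same assumptions as above; parameters outside the standard OpenAI schema (top_k, repetition_penalty, min_p) are passed via the client's extra_body mechanism.

```python
# Sketch of applying a sampler configuration to a completion request.
# All values are placeholders; base_url and env var name are assumptions.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.featherless.ai/v1",   # assumed endpoint
    api_key=os.environ["FEATHERLESS_API_KEY"],  # assumed env var name
)

sampler_config = {
    # Standard OpenAI-compatible sampler fields.
    "temperature": 0.8,
    "top_p": 0.95,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    # Fields outside the standard OpenAI schema go in extra_body.
    "extra_body": {
        "top_k": 40,
        "repetition_penalty": 1.05,
        "min_p": 0.05,
    },
}

response = client.completions.create(
    model="zai-org/GLM-4-32B-Base-0414",
    prompt="Once upon a time",
    max_tokens=64,
    **sampler_config,
)
print(response.choices[0].text)
```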