moonshotai/Kimi-K2-Thinking
TEXT GENERATION · Open Weights · Warm
Concurrency Cost: 4 · Model Size: 1000B · Quant: FP8 · Ctx Length: 32k
Published: Nov 4, 2025 · License: modified-mit · Architecture: Transformer
Kimi K2 Thinking is a 1 trillion parameter Mixture-of-Experts (MoE) model developed by Moonshot AI, with 32 billion activated parameters and a 256K context window. It is designed as a thinking agent, excelling at multi-step reasoning and stable tool orchestration across hundreds of sequential calls. It achieves state-of-the-art performance on benchmarks such as Humanity's Last Exam (HLE) and BrowseComp, and supports native INT4 quantization, which reduces inference latency and memory use without degrading output quality.
Popular Sampler Settings
The top 3 parameter combinations used by Featherless users for this model cover the following sampler settings:
temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p
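The sampler settings above are the knobs passed alongside a generation request. As a minimal sketch, the snippet below assembles an OpenAI-style chat-completion payload carrying those parameters. The helper name `build_payload` and the default values are illustrative assumptions, not the community configs from the tabs above (those values did not survive the page export); consult the Featherless API documentation for the exact parameters this model accepts.

```python
import json

def build_payload(prompt, temperature=0.6, top_p=0.95, top_k=40,
                  frequency_penalty=0.0, presence_penalty=0.0,
                  repetition_penalty=1.0, min_p=0.05):
    """Assemble an OpenAI-style chat-completion request body.

    All sampler defaults here are placeholders for illustration only.
    """
    return {
        "model": "moonshotai/Kimi-K2-Thinking",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
        "top_p": top_p,
        "top_k": top_k,
        "frequency_penalty": frequency_penalty,
        "presence_penalty": presence_penalty,
        "repetition_penalty": repetition_penalty,
        "min_p": min_p,
    }

payload = build_payload("Explain mixture-of-experts routing in two sentences.")
print(json.dumps(payload, indent=2))
```

This body would then be POSTed to an OpenAI-compatible completions endpoint with an API key; lower `temperature`/`top_p` values make sampling more deterministic, while the penalty parameters discourage repetition.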