xiaolesu/qwen3-8b-sft-stmt-tk-v2
Text generation · Concurrency cost: 1 · Model size: 8B · Quant: FP8 · Context length: 32k · Published: Mar 22, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

The xiaolesu/qwen3-8b-sft-stmt-tk-v2 model is an 8 billion parameter language model fine-tuned from Qwen/Qwen3-8B. It is optimized for tasks related to the Lean 4 theorem prover, using a specialized dataset for supervised fine-tuning. The model is designed to assist with formal mathematics and proof-related applications within the Lean 4 ecosystem, and offers a 32768-token context length.
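A minimal sketch of how one might prompt the model for Lean 4 proof completion through the Hugging Face `transformers` chat interface. The prompt wording, the chat-message format, and the `build_lean4_messages` helper are assumptions for illustration; the model card does not document an exact prompt template.

```python
# Sketch: wrap a Lean 4 theorem statement into a chat-style prompt for the
# model. The phrasing here is an assumption, not a documented template.

def build_lean4_messages(theorem_statement: str) -> list[dict]:
    """Wrap a Lean 4 theorem statement in a chat-style message list."""
    return [
        {
            "role": "user",
            "content": (
                "Complete the following Lean 4 theorem with a proof:\n\n"
                f"{theorem_statement}"
            ),
        }
    ]

messages = build_lean4_messages(
    "theorem add_comm' (a b : Nat) : a + b = b + a := by sorry"
)

# Running the model itself requires downloading the 8B checkpoint, e.g.:
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained("xiaolesu/qwen3-8b-sft-stmt-tk-v2")
# model = AutoModelForCausalLM.from_pretrained("xiaolesu/qwen3-8b-sft-stmt-tk-v2")
# inputs = tokenizer.apply_chat_template(
#     messages, add_generation_prompt=True, return_tensors="pt"
# )
# outputs = model.generate(inputs, max_new_tokens=512)
```

The commented-out generation calls are left inert because loading an 8B checkpoint is not practical inline; they show the standard `transformers` call pattern one would use with this repository ID.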
