t-tech/T-pro-it-2.1 is a 32-billion-parameter Russian-optimized language model built on the Qwen 3 family, with a 32,768-token context length. Developed by t-tech, it offers significantly improved instruction following and tool calling, outperforming both its predecessor T-pro-it-2.0 and Qwen3-32B in these areas. It is designed for general tasks as well as complex agentic workflows, with efficient inference on Russian text.
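As a minimal sketch of how the model's tool-calling capability might be exercised, the snippet below builds an OpenAI-style chat request payload for T-pro-it-2.1, as one would send to an OpenAI-compatible serving endpoint (for example, one exposed by vLLM). The endpoint URL and the `get_weather` tool are illustrative assumptions and are not part of the model card; the actual HTTP call is shown commented out.

```python
import json

MODEL = "t-tech/T-pro-it-2.1"

# Hypothetical tool definition in the OpenAI function-calling schema.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # illustrative tool, not from the model card
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# A Russian-language conversation, since the model targets Russian text.
messages = [
    {"role": "system", "content": "Ты полезный ассистент."},
    {"role": "user", "content": "Какая погода в Москве?"},
]

payload = {"model": MODEL, "messages": messages, "tools": tools}

# The actual request would go to a running server, e.g.:
# import requests
# resp = requests.post("http://localhost:8000/v1/chat/completions", json=payload)

print(json.dumps(payload, ensure_ascii=False, indent=2))
```

If the model decides to call the tool, the response would carry a `tool_calls` entry naming `get_weather` with JSON arguments, which the client executes and feeds back as a `tool`-role message.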