laion/Kimi-K2T-ling-coder-sft-sandboxes-1-maxeps-32k
Text generation · Model size: 8B · Quant: FP8 · Context length: 32k · Concurrency cost: 1 · Published: Jan 20, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights

laion/Kimi-K2T-ling-coder-sft-sandboxes-1-maxeps-32k is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B on a dataset focused on coding tasks. It is optimized for code generation and understanding, targets applications that need robust performance in programming contexts, and supports a context length of 32768 tokens.
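Assuming the weights are published on the Hugging Face Hub under the model ID above (not confirmed by this page), a minimal sketch of loading the model with the `transformers` library might look like this; the prompt and generation parameters are illustrative, not part of the model card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model ID taken from the card above; availability on the Hub is an assumption.
MODEL_ID = "laion/Kimi-K2T-ling-coder-sft-sandboxes-1-maxeps-32k"


def generate_code(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a code completion for `prompt` (downloads ~8B of weights)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # pick up the FP8/bf16 dtype stored in the checkpoint
        device_map="auto",    # place layers on available GPU(s)
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


# Example usage (requires a GPU and downloads the checkpoint):
# print(generate_code("def fibonacci(n: int) -> int:"))
```

The function is defined but not invoked at import time, since loading an 8B checkpoint requires a GPU and a network download.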
