laion/kimi-k2t-freelancer-32ep-32k
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 8B · Quant: FP8 · Ctx Length: 32k · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold

laion/kimi-k2t-freelancer-32ep-32k is an 8-billion-parameter language model fine-tuned from Qwen/Qwen3-8B. It was adapted on the penfever/kimi-k2t-freelancer-32ep-32k dataset, which suggests a specialization toward the tasks represented in that data. The model supports a context length of 32,768 tokens, enabling it to process and generate longer sequences of text.
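Because the model is derived from Qwen3-8B, it should load through the standard Hugging Face transformers causal-LM interface. The snippet below is a minimal usage sketch, assuming the weights are published on the Hugging Face Hub under the same repo id and that the checkpoint ships a chat template inherited from its Qwen3 base; adjust the repo id and dtype handling to match how the weights are actually distributed.

```python
# Minimal sketch: load the model and run a single chat-style generation.
# Assumes the checkpoint is available under this repo id on the Hugging Face Hub
# and exposes the standard Qwen3-style tokenizer/chat template.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "laion/kimi-k2t-freelancer-32ep-32k"  # assumed Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype recorded in the checkpoint config
    device_map="auto",    # place weights on available GPU(s) automatically
)

messages = [
    {"role": "user", "content": "Summarize the benefits of a 32k context window."}
]
# Build input ids from the model's chat template
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```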