vaclavak/qwen-2.5-10k-ultrachat
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quant: FP8 · Context length: 32k · Published: Mar 23, 2026 · License: apache-2.0 · Architecture: Transformer · Open weights · Cold

The vaclavak/qwen-2.5-10k-ultrachat model is an experimental 7.6-billion-parameter causal language model based on the Qwen 2.5 architecture. Developed by vaclavak, it was fine-tuned on 10,000 lines of the ultrachat_200k dataset and supports a 32,768-token context length. The model is intended primarily for experimentation, exploring how fine-tuning on a small, targeted dataset affects the Qwen 2.5 base model.
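Like other Qwen 2.5 chat models, this model expects prompts in the ChatML format, where each turn is wrapped in `<|im_start|>`/`<|im_end|>` markers. A minimal sketch of building such a prompt by hand (the helper name `build_chatml_prompt` is illustrative, not part of any library; in practice a tokenizer's `apply_chat_template` would do this for you):

```python
def build_chatml_prompt(messages):
    """Render a list of {'role', 'content'} dicts into a ChatML prompt string.

    Appends an open assistant turn at the end so the model's completion
    continues as the assistant's reply.
    """
    parts = []
    for message in messages:
        parts.append(
            f"<|im_start|>{message['role']}\n{message['content']}<|im_end|>\n"
        )
    # Leave the assistant turn open for the model to complete.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)


prompt = build_chatml_prompt(
    [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what fine-tuning does."},
    ]
)
print(prompt)
```

With a Hugging Face tokenizer loaded for this model, `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` would produce an equivalent prompt without manual string handling.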
