ericflo/Qwen2.5-7B-Think-KTO-v0.1
Text generation · Concurrency cost: 1 · Model size: 7.6B · Quantization: FP8 · Context length: 32k · Published: Jan 28, 2025 · License: apache-2.0 · Architecture: Transformer · Open weights · Warm

ericflo/Qwen2.5-7B-Think-KTO-v0.1 is a 7.6-billion-parameter language model developed by Eric Florenzano, built on the Qwen2.5-7B architecture. It was fine-tuned with Kahneman-Tversky Optimization (KTO), which uses binary (desirable/undesirable) feedback signals rather than paired preference data, to improve reasoning quality. The model is trained to generate responses with an explicit thought process before its final answer, making it suited to tasks that benefit from natural, human-like reasoning and a clear thought-to-answer progression.
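As a quick illustration, the checkpoint can be used like any other Hugging Face causal LM via the `transformers` library. The sketch below is an assumption-laden example, not documented usage for this checkpoint: it assumes the repo ships a standard chat template, and the helper name `generate_reply` and the sampling settings are illustrative choices.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "ericflo/Qwen2.5-7B-Think-KTO-v0.1"


def generate_reply(question: str, max_new_tokens: int = 512) -> str:
    """Load the checkpoint and return its full response, including the
    explicit reasoning trace the model is trained to emit."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    # Standard chat formatting; no special "think" prompt is assumed here.
    inputs = tokenizer.apply_chat_template(
        [{"role": "user", "content": question}],
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)


# Example call (loading the 7.6B weights requires a suitable GPU):
# print(generate_reply("What is 17 * 24?"))
```

Because the model interleaves its reasoning with the answer, downstream code that needs only the final answer may want to post-process the decoded text.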
