openbmb/Eurus-7b-kto
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Apr 1, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Eurus-7B-KTO is a 7-billion-parameter language model from OpenBMB, fine-tuned from Eurus-7B-SFT using KTO (Kahneman-Tversky Optimization). It is optimized for reasoning tasks, particularly math and coding, and demonstrates strong multi-turn interaction capabilities. Among open-source models of similar size it achieves high performance, outperforming larger baselines in several domains.
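As a usage sketch (not taken from the model card), the model can be queried with the Hugging Face `transformers` library. The `[INST] ... [/INST]` prompt template is an assumption based on the Eurus models' Mistral-7B lineage; verify it against the official model card before relying on it.

```python
# Hedged sketch of querying openbmb/Eurus-7b-kto via transformers.
# Assumptions: Mistral-style [INST] instruction template; weights are
# downloaded on first use (roughly 14 GB for a 7B model).

def build_prompt(query: str) -> str:
    """Wrap a user query in the assumed Eurus instruction template."""
    return f"[INST] {query} [/INST]"

def generate(query: str, max_new_tokens: int = 256) -> str:
    """Load the model and generate a completion for a single query."""
    # Imported lazily so build_prompt works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "openbmb/Eurus-7b-kto"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(build_prompt(query), return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

if __name__ == "__main__":
    print(generate("What is 12 * 13? Reason step by step."))
```

The lazy import and the `__main__` guard keep the prompt-formatting helper usable without pulling in the model weights.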
