ojaffe/dfee6a-exp-077
Text generation
- Concurrency cost: 1
- Model size: 0.8B
- Quantization: BF16
- Context length: 32k
- Published: Apr 1, 2026
- Architecture: Transformer

ojaffe/dfee6a-exp-077 is a 0.8 billion parameter language model, fine-tuned from Qwen/Qwen3-0.6B using the TRL library. It was trained with KTO ("KTO: Model Alignment as Prospect Theoretic Optimization"), an alignment method based on prospect theory, and is intended for general text generation tasks.
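Since the card describes a general text-generation model, a minimal usage sketch might look like the following. This assumes the repo id above resolves on the Hugging Face Hub and that the `transformers` and `torch` packages are installed; the prompt and generation settings are illustrative.

```python
# Minimal inference sketch for this model card. The repo id is taken from
# the card; its availability on the Hugging Face Hub is assumed.
model_id = "ojaffe/dfee6a-exp-077"

def generate(prompt: str, max_new_tokens: int = 32) -> str:
    """Load the model lazily and generate a completion for `prompt`."""
    # Imports live inside the function so that merely defining it does not
    # require the heavy transformers/torch dependencies.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # The card lists BF16, so load the weights in bfloat16.
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16
    )

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Write a haiku about spring."))
```

Because the model was KTO-aligned rather than instruction-tuned from scratch, outputs should follow the base Qwen3-0.6B tokenizer and chat conventions.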
