udkai/Turdus
Text Generation
- Model Size: 7B
- Quantization: FP8
- Context Length: 4k
- Concurrency Cost: 1
- Architecture: Transformer
- Published: Jan 12, 2024
- License: cc-by-nc-4.0
- Open Weights
udkai/Turdus is a 7-billion-parameter language model developed by UDK dot AI (Daniel Devatman Hromada). It is a direct preference optimization (DPO) fine-tune of NeuralMarcoro14-7B, trained for one epoch on a specially modified Winogrande dataset. Despite the Winogrande-focused training data, the model shows a subtle increase in average accuracy on unrelated benchmarks such as ARC, HellaSwag, and TruthfulQA, suggesting an unusual DPO contamination effect. The model is primarily notable for this experimental methodology, which explores how a specific DPO dataset affects broader benchmark performance.
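For readers unfamiliar with DPO, the per-example objective can be sketched in a few lines. The function below is an illustrative, self-contained version of the standard DPO loss; the argument names and the value `beta=0.1` are assumptions for illustration, not details taken from this model card (in Turdus's case, the frozen reference policy would be NeuralMarcoro14-7B):

```python
import math

def dpo_loss(pi_chosen, pi_rejected, ref_chosen, ref_rejected, beta=0.1):
    """Per-example DPO loss computed from sequence log-probabilities.

    pi_*  : log-probabilities under the policy being fine-tuned
    ref_* : log-probabilities under the frozen reference model
    beta  : temperature controlling how strongly the policy is
            penalized for drifting from the reference
    """
    # Margin: how much more the policy prefers the chosen answer
    # over the rejected one, relative to the reference model.
    margin = beta * ((pi_chosen - ref_chosen) - (pi_rejected - ref_rejected))
    # -log sigmoid(margin): small when the policy already favors
    # the chosen answer more than the reference does.
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# When policy and reference agree exactly, the margin is zero and
# the loss equals log(2); favoring the chosen answer lowers it.
print(dpo_loss(-5.0, -7.0, -5.0, -7.0))  # log(2) ≈ 0.6931
print(dpo_loss(-4.0, -8.0, -5.0, -7.0))  # below log(2)
```

Minimizing this loss over preference pairs is what "fine-tuned for one epoch using a specially modified Winogrande dataset" refers to: Winogrande items are recast as chosen/rejected completions.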