udkai/Garrulus
Text Generation | Open Weights
Model Size: 7B | Quant: FP8 | Context Length: 4K | Concurrency Cost: 1
Published: Jan 9, 2024 | License: apache-2.0 | Architecture: Transformer

udkai/Garrulus is a 7-billion-parameter causal language model developed by udkai, based on mlabonne/NeuralMarcoro14-7B. It was intentionally fine-tuned with Direct Preference Optimization (DPO) on a modified Winogrande dataset, and shows improved scores on commonsense-reasoning benchmarks such as Winogrande, TruthfulQA, HellaSwag, and ARC Challenge, making it suitable for applications that require stronger reasoning.
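For context, the DPO objective mentioned above optimizes the policy model to prefer a "chosen" completion over a "rejected" one relative to a frozen reference model. This is a minimal sketch of the per-pair loss in plain Python, not the training code used for this model; the function name and inputs (sequence log-probabilities under the policy and reference models) are illustrative assumptions.

```python
import math

def dpo_loss(policy_chosen_logp: float, policy_rejected_logp: float,
             ref_chosen_logp: float, ref_rejected_logp: float,
             beta: float = 0.1) -> float:
    """DPO loss for a single preference pair.

    Inputs are total log-probabilities of the chosen and rejected
    completions under the policy and the frozen reference model.
    beta scales how strongly deviations from the reference are penalized.
    """
    # Implicit reward of each completion: log-ratio of policy to reference.
    chosen_reward = policy_chosen_logp - ref_chosen_logp
    rejected_reward = policy_rejected_logp - ref_rejected_logp
    # Margin between chosen and rejected rewards, scaled by beta.
    logits = beta * (chosen_reward - rejected_reward)
    # Negative log-sigmoid of the margin: loss shrinks as the policy
    # prefers the chosen completion more than the reference does.
    return -math.log(1.0 / (1.0 + math.exp(-logits)))

# With no preference margin, the loss is -log(0.5) = log 2; it drops
# once the policy favors the chosen completion.
baseline = dpo_loss(-2.0, -2.0, -2.0, -2.0)
improved = dpo_loss(-1.0, -3.0, -2.0, -2.0)
```

Training on a Winogrande-derived preference set this way pushes the model toward the correct pronoun-resolution choices, which is consistent with the benchmark gains described above.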
