paulilioaica/Hugo-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 28, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
paulilioaica/Hugo-7B-slerp is a 7-billion-parameter language model created by merging Mistral-7B-Instruct-v0.2 and CodeNinja-1.0-OpenChat-7B with spherical linear interpolation (SLERP). The merged model outperforms its base models on benchmarks such as ARC, MMLU, and Winogrande, while retaining a 4096-token context length. It is intended for general conversational tasks with enhanced reasoning, particularly in areas where its merged components excel.
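SLERP merges interpolate along the arc between two weight tensors rather than along the straight line, which better preserves each parent's weight geometry. The model card does not include the merge script, but a minimal sketch of the per-tensor operation, assuming PyTorch, same-shaped tensors, and an interpolation factor `t` (the function name `slerp` and the threshold values are illustrative, not the exact mergekit implementation), might look like:

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor,
          dot_threshold: float = 0.9995, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two same-shaped weight tensors."""
    v0_f = v0.flatten().float()
    v1_f = v1.flatten().float()
    # Measure the angle between normalized copies of the tensors.
    dot = torch.dot(v0_f / (v0_f.norm() + eps), v1_f / (v1_f.norm() + eps))
    if dot.abs() > dot_threshold:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        merged = (1.0 - t) * v0_f + t * v1_f
    else:
        theta = torch.acos(dot.clamp(-1.0, 1.0))
        sin_theta = torch.sin(theta)
        merged = (torch.sin((1.0 - t) * theta) / sin_theta) * v0_f \
               + (torch.sin(t * theta) / sin_theta) * v1_f
    return merged.reshape(v0.shape).to(v0.dtype)

# Illustrative use: blend one layer's weights from the two parents 50/50
# (the state-dict names below are hypothetical).
# merged_w = slerp(0.5,
#                  mistral_state["model.layers.0.mlp.up_proj.weight"],
#                  codeninja_state["model.layers.0.mlp.up_proj.weight"])
```

In practice this is applied tensor-by-tensor across both checkpoints, often with a different `t` per layer group, which is how a merge can favor one parent's instruction-following in some layers and the other's code ability elsewhere.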