weezywitasneezy/OxytocinErosEngineeringF1-7B-slerp
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 24, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights
OxytocinErosEngineeringF1-7B-slerp is a 7-billion-parameter language model created by weezywitasneezy. It was produced by merging ChaoticNeutrals/Eris_Remix_7B and Virt-io/Erebus-Holodeck-7B using the slerp (spherical linear interpolation) merge method. The model scores an average of 69.22 on the Open LLM Leaderboard, reflecting performance across reasoning, common-sense, and language-understanding benchmarks. With a context length of 4096 tokens, it is suitable for general-purpose text generation and conversational AI applications.
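To illustrate the slerp method mentioned above: rather than averaging two models' weights along a straight line, slerp interpolates along the arc between them, which better preserves the magnitude and direction of the parameter tensors. The sketch below shows the core formula applied to plain Python lists; it is a simplified illustration of the interpolation itself, not the actual merge tooling used to build this model.

```python
import math

def slerp(v0, v1, t, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t moves along the arc
    between the two vectors rather than the straight-line chord.
    """
    n0 = math.sqrt(sum(x * x for x in v0))
    n1 = math.sqrt(sum(x * x for x in v1))
    # Cosine of the angle between the vectors, clamped for numerical safety.
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(v0, v1)) / (n0 * n1)))
    theta = math.acos(dot)
    if abs(math.sin(theta)) < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    w0 = math.sin((1 - t) * theta) / math.sin(theta)
    w1 = math.sin(t * theta) / math.sin(theta)
    return [w0 * a + w1 * b for a, b in zip(v0, v1)]
```

In a real merge, this interpolation is applied tensor-by-tensor across the two parent checkpoints, often with a per-layer interpolation factor t.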