Brouz/MaximalSlerp
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Context Length: 4k · License: llama2 · Architecture: Transformer · Open Weights

Brouz/MaximalSlerp is a 13-billion-parameter language model created by Brouz, produced with a gradient SLERP merge of Gryphe/MythoLogic-L2-13b and The-Face-Of-Goonery/Huginn-13b-v1.2. The merge was performed with the Mergekit framework, combining the strengths of both base models into a single set of weights. With a context length of 4096 tokens, it is intended for general-purpose text generation and understanding tasks.
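For readers unfamiliar with how a SLERP merge combines two checkpoints: each pair of corresponding weight tensors is interpolated along the great circle between them rather than linearly, which better preserves the magnitude structure of the weights. The sketch below is a minimal, illustrative implementation of spherical linear interpolation on flattened tensors; it is not Mergekit's actual code (the "gradient" variant additionally varies the interpolation factor `t` across layers), and the fallback threshold is an assumption.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors.

    t: interpolation factor in [0, 1]; 0 returns v0, 1 returns v1.
    Falls back to plain linear interpolation when the tensors are
    nearly colinear, where the spherical formula is numerically unstable.
    """
    v0_flat = v0.ravel().astype(np.float64)
    v1_flat = v1.ravel().astype(np.float64)

    # Normalize copies to measure the angle between the two tensors.
    v0_n = v0_flat / (np.linalg.norm(v0_flat) + eps)
    v1_n = v1_flat / (np.linalg.norm(v1_flat) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)

    if abs(dot) > 0.9995:  # nearly parallel: lerp is a stable approximation
        return ((1.0 - t) * v0_flat + t * v1_flat).reshape(v0.shape)

    theta = np.arccos(dot)
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return (s0 * v0_flat + s1 * v1_flat).reshape(v0.shape)
```

In a real merge this function would be applied tensor-by-tensor across both checkpoints, with `t` (or a per-layer schedule of `t` values, in the gradient case) controlling how much each base model contributes.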
