tushar310/Hippy-AAI-7B
Task: Text Generation
Concurrency Cost: 1
Model Size: 7B
Quantization: FP8
Context Length: 4k
Published: Mar 14, 2024
License: apache-2.0
Architecture: Transformer
Tags: Open Weights, Cold

Hippy-AAI-7B is a 7-billion-parameter language model created by tushar310, produced by merging EmbeddedLLM/Mistral-7B-Merge-14-v0.1 and liminerity/M7-7b via slerp (spherical linear interpolation). The merge combines the strengths of its constituent Mistral-based models, retains a 4096-token context length, and is intended for general language tasks.
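To illustrate the merge technique named above, here is a minimal sketch of slerp applied to a pair of weight tensors. This is not the exact procedure used to build Hippy-AAI-7B (the card does not publish the merge config); the function name and the per-tensor interpolation factor `t` are assumptions for illustration.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values interpolate
    along the great-circle arc between the two (flattened) tensors.
    Illustrative sketch only, not the exact merge code used for this model.
    """
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    # Normalize copies only to measure the angle between the tensors.
    n0 = v0f / (np.linalg.norm(v0f) + eps)
    n1 = v1f / (np.linalg.norm(v1f) + eps)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    # Nearly parallel tensors: fall back to plain linear interpolation.
    if abs(dot) > 1.0 - eps:
        return ((1.0 - t) * v0f + t * v1f).reshape(v0.shape).astype(v0.dtype)
    theta = np.arccos(dot)
    s = np.sin(theta)
    w0 = np.sin((1.0 - t) * theta) / s
    w1 = np.sin(t * theta) / s
    return (w0 * v0f + w1 * v1f).reshape(v0.shape).astype(v0.dtype)
```

In a full model merge, a loop would apply this tensor-by-tensor across both checkpoints' state dicts, optionally varying `t` per layer.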
