tushar310/MisGemma-7B
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 14, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

MisGemma-7B is a 7-billion-parameter language model created by tushar310 by merging EmbeddedLLM/Mistral-7B-Merge-14-v0.1 and HuggingFaceH4/zephyr-7b-beta. The merge uses the slerp (spherical linear interpolation) method, combining the strengths of its base models into a versatile text-generation model. It is designed for general-purpose text generation and understanding tasks, building upon the Mistral and Zephyr model families.
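Slerp interpolates along the arc between two vectors rather than the straight line, which tends to preserve weight norms better than plain averaging when merging model parameters. A minimal sketch of the operation on two small vectors (illustrative only; real merges apply this tensor-by-tensor with dedicated merge tooling, and the interpolation fraction `t` here is a hypothetical choice, not the value used for MisGemma-7B):

```python
import math

def slerp(a, b, t):
    """Spherical linear interpolation between vectors a and b at fraction t."""
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    # Cosine of the angle between the vectors, clamped for numerical safety
    dot = sum(x * y for x, y in zip(a, b)) / (norm_a * norm_b)
    dot = max(-1.0, min(1.0, dot))
    theta = math.acos(dot)
    if theta < 1e-6:
        # Nearly parallel vectors: fall back to linear interpolation
        return [(1 - t) * x + t * y for x, y in zip(a, b)]
    s = math.sin(theta)
    wa = math.sin((1 - t) * theta) / s
    wb = math.sin(t * theta) / s
    return [wa * x + wb * y for x, y in zip(a, b)]

# Midpoint of two orthogonal unit vectors stays on the unit sphere
mid = slerp([1.0, 0.0], [0.0, 1.0], 0.5)
```

Unlike a linear average (which here would give `[0.5, 0.5]`, norm ≈ 0.71), the slerp midpoint keeps unit norm, which is the usual motivation for choosing slerp over linear merging.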
