tourist800/Mistral-7B-Merge-14-v0.2
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 28, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Mistral-7B-Merge-14-v0.2 is a 7-billion-parameter language model created by tourist800, produced by merging EmbeddedLLM/Mistral-7B-Merge-14-v0.1 and amazon/MistralLite using SLERP (spherical linear interpolation). By interpolating the weights of its two base models rather than training from scratch, the merge aims to combine their strengths, offering a versatile foundation for general language tasks with a balance of performance and efficiency within a 7B-parameter footprint.
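To illustrate what SLERP merging does, the sketch below interpolates two tensors along the great-circle arc between them, so the result preserves geometric properties that plain linear averaging would not. This is a minimal NumPy illustration, not mergekit's actual implementation, and the blend ratio `t` and any per-layer scheduling used for this particular merge are not stated in the card.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate values blend
    along the arc between the two (flattened) tensors.
    """
    v0f, v1f = v0.ravel(), v1.ravel()
    # Cosine of the angle between the two tensors
    dot = np.dot(v0f, v1f) / (np.linalg.norm(v0f) * np.linalg.norm(v1f))
    dot = np.clip(dot, -1.0, 1.0)
    omega = np.arccos(dot)
    so = np.sin(omega)
    if abs(so) < eps:
        # Nearly parallel tensors: fall back to linear interpolation
        return (1.0 - t) * v0 + t * v1
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1
```

In a real merge this interpolation is applied weight-by-weight across the two checkpoints (both Mistral-7B derivatives here, so their tensor shapes line up).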
