Aditya685/Upshot-NeuralHermes-2.5-Mistral-7B-slerp
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Feb 4, 2024 | License: MIT | Architecture: Transformer

Aditya685/Upshot-NeuralHermes-2.5-Mistral-7B-slerp is a 7-billion-parameter language model created by Aditya685 by merging mlabonne/NeuralHermes-2.5-Mistral-7B and Aditya685/upshot-sih with the slerp (spherical linear interpolation) method. It supports a 4096-token context length and targets general text generation, aiming to combine the strengths of its two parent models. It is suited to applications that need a capable 7B model produced by weight-space merging rather than additional training.
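The exact merge configuration is not published here, but slerp merging interpolates corresponding weight tensors along the arc between them rather than along a straight line, which preserves tensor magnitude better than plain averaging. The sketch below is a minimal illustration of that operation, assuming PyTorch tensors; the `slerp` helper and the 0.5 interpolation factor are illustrative and not the model's actual merge settings.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors."""
    # Flatten and normalize to compute the angle between the tensors.
    v0_flat, v1_flat = v0.flatten().float(), v1.flatten().float()
    v0_unit = v0_flat / (v0_flat.norm() + eps)
    v1_unit = v1_flat / (v1_flat.norm() + eps)
    dot = torch.clamp(torch.dot(v0_unit, v1_unit), -1.0, 1.0)
    omega = torch.acos(dot)  # angle between the two tensors
    # Fall back to plain linear interpolation when the tensors are nearly parallel.
    if omega.abs() < eps:
        return (1.0 - t) * v0 + t * v1
    sin_omega = torch.sin(omega)
    scale0 = torch.sin((1.0 - t) * omega) / sin_omega
    scale1 = torch.sin(t * omega) / sin_omega
    return (scale0 * v0_flat + scale1 * v1_flat).reshape(v0.shape).to(v0.dtype)

# Hypothetical usage: blend each parameter of the two parent models.
# merged_state[name] = slerp(0.5, neuralhermes_state[name], upshot_state[name])
```

Merges like this are typically produced with tooling such as mergekit, which applies this interpolation per weight tensor with configurable per-layer interpolation factors.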
