shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp
Text Generation · Model Size: 7B · Quantization: FP8 · Context Length: 4k · Concurrency Cost: 1 · Published: Feb 16, 2024 · License: apache-2.0 · Architecture: Transformer

StarlingHermes-2.5-Mistral-7B-slerp by shahzebnaveed is a 7-billion-parameter language model created by merging NeuralHermes-2.5-Mistral-7B and Starling-LM-7B-alpha with the SLERP (spherical linear interpolation) method. The merge is intended to combine the strengths of its two base models into a general-purpose foundation for natural language processing tasks. The architecture follows the Mistral family and provides a 4,096-token context window.
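The model card does not include the actual merge configuration, so as a rough illustration of the SLERP technique itself, here is a minimal sketch of spherical linear interpolation applied to a single pair of weight tensors. The function name, the interpolation factor `t`, and the per-tensor application are assumptions; real merges (e.g. with mergekit) typically vary `t` across layer groups.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate values follow the
    great-circle arc between the flattened tensors. Illustrative sketch only.
    """
    # Flatten and normalize copies to measure the angle between the tensors.
    v0_f = v0.flatten().float()
    v1_f = v1.flatten().float()
    v0_n = v0_f / (v0_f.norm() + eps)
    v1_n = v1_f / (v1_f.norm() + eps)

    dot = torch.clamp(torch.dot(v0_n, v1_n), -1.0, 1.0)
    theta = torch.acos(dot)

    # Nearly parallel tensors: fall back to plain linear interpolation.
    if theta.abs() < 1e-4:
        merged = (1.0 - t) * v0_f + t * v1_f
    else:
        sin_theta = torch.sin(theta)
        merged = (torch.sin((1.0 - t) * theta) / sin_theta) * v0_f \
               + (torch.sin(t * theta) / sin_theta) * v1_f

    return merged.reshape(v0.shape).to(v0.dtype)
```

A full merge applies this per matching parameter tensor across the two checkpoints. Assuming the merged weights are hosted under the id shown in the title on the Hugging Face Hub, a standard transformers loading-and-generation snippet would look like the following (the prompt and generation settings are placeholders):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "shahzebnaveed/StarlingHermes-2.5-Mistral-7B-slerp"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

prompt = "Explain spherical linear interpolation in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```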
