flemmingmiguel/Distilled-HermesChat-7B
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Jan 12, 2024 | License: apache-2.0 | Architecture: Transformer | Open Weights

flemmingmiguel/Distilled-HermesChat-7B is a 7-billion-parameter language model produced by merging openchat/openchat-3.5-0106 and argilla/distilabeled-Hermes-2.5-Mistral-7B. The merge is experimental, intended to identify base configurations that work well as a starting point for further fine-tuning. It uses the slerp (spherical linear interpolation) merge method with separate interpolation settings for the self-attention and MLP layers, and is suitable for general conversational AI tasks and as a foundation for specialized applications.
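A slerp merge can be pictured as interpolating each pair of parent weight tensors along the arc between them on a hypersphere rather than along a straight line. The sketch below is a minimal Python illustration of that idea; the function names, interpolation factors, and layer-matching rules are hypothetical and do not reproduce the actual configuration used for this model.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    Flattens each tensor, measures the angle between their normalized
    copies, and blends them with spherical weights. Falls back to plain
    linear interpolation when the tensors are nearly parallel.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    dot = torch.clamp(torch.dot(a_unit, b_unit), -1.0, 1.0)
    omega = torch.arccos(dot)
    if omega.abs() < eps:  # nearly parallel: spherical weights degenerate to lerp
        merged = (1.0 - t) * a_flat + t * b_flat
    else:
        merged = (torch.sin((1.0 - t) * omega) * a_flat
                  + torch.sin(t * omega) * b_flat) / torch.sin(omega)
    return merged.reshape(a.shape).to(a.dtype)

# Hypothetical per-layer interpolation factors: pull self-attention weights
# toward one parent and MLP weights toward the other, reflecting the
# "specific parameter adjustments" mentioned above; the real values used
# for this merge are not published here.
def factor_for(param_name: str) -> float:
    if "self_attn" in param_name:
        return 0.3
    if "mlp" in param_name:
        return 0.7
    return 0.5
```

In practice, merges like this are typically produced with a merging toolkit that applies per-tensor interpolation factors from a declarative config rather than hand-written loops; the sketch only shows the underlying operation.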
