Cartinoe5930/TIES-Merging
Text Generation · Model Size: 7B · Quantization: FP8 · Context Length: 4k · Concurrency Cost: 1 · Published: Jan 18, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
TIES-Merging by Cartinoe5930 is a 7 billion parameter language model created by merging three Mistral-7B-Instruct-v0.2 based models: Open-Orca/Mistral-7B-OpenOrca, openchat/openchat-3.5-0106, and WizardLM/WizardMath-7B-V1.1. The merge uses the TIES-merging method to combine the strengths of the source models, and the resulting model exposes a 4096-token context length. It is designed to offer balanced performance across general instruction following, conversational AI, and mathematical reasoning tasks.
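For readers unfamiliar with the method, the sketch below illustrates the core TIES-merging steps (trim each task vector by magnitude, elect a per-element sign, then merge only sign-agreeing values) on a single parameter tensor. This is a minimal, illustrative implementation, not the exact pipeline used to build this model; the function name `ties_merge` and the `density` and `lam` parameters are assumptions chosen for clarity.

```python
import torch

def ties_merge(base: torch.Tensor, finetuned: list[torch.Tensor],
               density: float = 0.2, lam: float = 1.0) -> torch.Tensor:
    """Minimal TIES-merging sketch for a single parameter tensor.

    Steps: (1) trim each task vector to its top-`density` fraction of
    entries by magnitude, (2) elect a per-element sign by summing the
    trimmed vectors, (3) average only the values that agree with the
    elected sign, then add the result back onto the base weights.
    """
    # Task vectors: difference between each fine-tuned model and the base.
    task_vectors = [ft - base for ft in finetuned]

    # Trim: keep only the largest-magnitude entries of each task vector.
    trimmed = []
    for tv in task_vectors:
        flat = tv.abs().flatten()
        k = max(1, int(density * flat.numel()))
        threshold = flat.topk(k).values.min()
        trimmed.append(torch.where(tv.abs() >= threshold, tv, torch.zeros_like(tv)))

    # Elect the dominant sign per element across all trimmed task vectors.
    stacked = torch.stack(trimmed)
    elected_sign = torch.sign(stacked.sum(dim=0))

    # Disjoint merge: average only entries whose sign matches the elected sign.
    mask = (torch.sign(stacked) == elected_sign) & (stacked != 0)
    counts = mask.sum(dim=0).clamp(min=1)
    merged_tv = (stacked * mask).sum(dim=0) / counts

    return base + lam * merged_tv
```

In practice, a merge like this one would apply the same procedure to every parameter tensor of the three source models against the shared Mistral-7B-Instruct-v0.2 base.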