AurelPx/Dare-k-7B-ties
Text generation | Model size: 7B | Quant: FP8 | Context length: 4k | Concurrency cost: 1 | Published: Mar 21, 2024 | License: apache-2.0 | Architecture: Transformer | Open weights
AurelPx/Dare-k-7B-ties is a 7-billion-parameter language model based on the Mistral-7B-v0.1 architecture, created by merging SamirGPT-v1 and Mistral-7B-Merge-14-v0.2 with the DARE TIES merging method. The merge aims to combine the strengths of its constituent models for balanced performance across general language tasks. It supports a context length of 4,096 tokens, making it suitable for applications with moderate input and output lengths.
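DARE TIES combines two ideas: DARE randomly drops a fraction of each fine-tuned model's parameter deltas (relative to the shared base model) and rescales the survivors to preserve their expected value, and TIES resolves conflicts by electing a per-parameter majority sign before averaging. A minimal NumPy sketch of those two stages on toy vectors; the function names, drop rate, and example deltas are illustrative and not the actual configuration used to build this model:

```python
import numpy as np

rng = np.random.default_rng(0)

def dare(delta, drop_p):
    # DARE (Drop And REscale): zero out a random fraction drop_p of the
    # delta, then rescale survivors so the expected value is unchanged.
    mask = rng.random(delta.shape) >= drop_p
    return (delta * mask) / (1.0 - drop_p)

def ties_merge(deltas, weights):
    # TIES sign election: per parameter, keep only the (weighted) deltas
    # whose sign agrees with the majority sign, then average the survivors.
    stacked = np.stack([w * d for w, d in zip(weights, deltas)])
    elected = np.sign(stacked.sum(axis=0))
    agree = np.sign(stacked) == elected
    kept = np.where(agree, stacked, 0.0)
    counts = np.maximum(agree.sum(axis=0), 1)  # avoid division by zero
    return kept.sum(axis=0) / counts

# Toy example: two models' deltas over the same 4-parameter base.
base = np.zeros(4)
delta_a = np.array([0.2, -0.4, 0.1, 0.0])
delta_b = np.array([0.3, 0.5, -0.1, 0.2])

sparse = [dare(d, drop_p=0.5) for d in (delta_a, delta_b)]
merged = base + ties_merge(sparse, weights=[1.0, 1.0])
```

In a real merge of 7B-parameter models, tools such as mergekit apply this per tensor, with the drop rate and per-model weights set in a merge configuration.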