louisbrulenaudet/Pearl-7B-0211-ties
Task: Text Generation · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Feb 11, 2024 · License: apache-2.0 · Architecture: Transformer · Concurrency Cost: 1
Pearl-7B-0211-ties by louisbrulenaudet is a 7.24-billion-parameter language model created with the TIES-Merging method, combining several 7B models including Pearl-7B-slerp, WizardMath-7B-V1.1, WestLake-7B-v2-laser, and NeuralTrix-7B-dpo. TIES-Merging aims to produce a consolidated multitask model by trimming redundant parameter updates and resolving sign conflicts across the constituent models before merging. The result performs strongly across standard benchmarks, particularly HellaSwag, TruthfulQA, and Winogrande, making it suitable for diverse general-purpose language tasks.
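To illustrate the merge technique described above, here is a minimal NumPy sketch of the TIES-Merging steps (trim, elect signs, disjoint merge) applied to a single parameter tensor. This is an illustrative simplification, not the actual recipe used to build Pearl-7B-0211-ties; the `density` value and the per-tensor framing are assumptions for the example.

```python
import numpy as np

def ties_merge(base, finetuned, density=0.5):
    """Sketch of TIES-Merging for one parameter tensor.

    base: weights of the shared base model.
    finetuned: list of weight tensors from the models being merged.
    density: fraction of largest-magnitude delta entries kept (assumed value).
    """
    # 1. Task vectors: each model's delta from the base model.
    deltas = [ft - base for ft in finetuned]

    # 2. Trim: keep only the top-`density` fraction of entries by magnitude,
    #    zeroing out the rest (this removes redundant small updates).
    trimmed = []
    for d in deltas:
        k = max(1, int(round(density * d.size)))
        thresh = np.sort(np.abs(d).ravel())[-k]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))

    # 3. Elect signs: per entry, the sign of the summed trimmed deltas
    #    (i.e. the direction with the larger total magnitude wins).
    stacked = np.stack(trimmed)
    sign = np.sign(stacked.sum(axis=0))

    # 4. Disjoint merge: average only the nonzero entries that agree
    #    with the elected sign, discarding conflicting updates.
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    num = np.where(agree, stacked, 0.0).sum(axis=0)
    cnt = agree.sum(axis=0)
    merged_delta = np.divide(num, cnt, out=np.zeros_like(num), where=cnt > 0)

    return base + merged_delta
```

With two models whose deltas agree on one entry and conflict on the others, only the agreeing entry survives the merge; entries with tied sign votes are dropped entirely, which is how TIES avoids the interference that plain averaging would introduce.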