louisbrulenaudet/Pearl-7B-0210-ties
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 11, 2024 · License: apache-2.0 · Architecture: Transformer

Pearl-7B-0210-ties is a 7.24 billion parameter language model developed by louisbrulenaudet, produced by TIES-merging several 7B models: Pearl-7B-slerp, WizardMath-7B-V1.1, WestLake-7B-v2-laser, and NeuralTrix-7B-dpo. TIES-Merging combines task-specific models efficiently by trimming redundant parameter updates and resolving sign conflicts between them, consolidating the diverse capabilities of the constituent models into a single multitask model.
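To make the merging procedure concrete, here is a toy sketch of the core TIES-Merging steps (trim, elect sign, disjoint merge) on flat NumPy vectors. This is an illustration of the general technique, not the actual pipeline used to build Pearl-7B-0210-ties; the function name, the top-k fraction `k`, and the flat-vector representation are all assumptions for the example.

```python
import numpy as np

def ties_merge(base, finetuned, k=0.5):
    """Toy TIES-Merging over flat parameter vectors (illustrative only).

    base      -- parameters of the shared base model, shape (n,)
    finetuned -- list of fine-tuned parameter vectors, each shape (n,)
    k         -- fraction of each task vector to keep, by magnitude
    """
    # 1. Task vectors: each fine-tuned model's delta from the base.
    deltas = [ft - base for ft in finetuned]

    # 2. Trim: zero out all but the top-k fraction of entries by magnitude,
    #    discarding redundant low-magnitude updates.
    trimmed = []
    for d in deltas:
        thresh = np.quantile(np.abs(d), 1 - k)
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    trimmed = np.stack(trimmed)

    # 3. Elect a sign per parameter from the summed trimmed deltas,
    #    resolving conflicts between models pulling in opposite directions.
    sign = np.sign(trimmed.sum(axis=0))

    # 4. Disjoint merge: average only the deltas that agree with the
    #    elected sign; parameters with no agreeing delta stay at base.
    agree = (np.sign(trimmed) == sign) & (trimmed != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged_delta = (trimmed * agree).sum(axis=0) / counts
    return base + merged_delta
```

In practice, merges like this one are typically run with a dedicated toolkit operating layer by layer over the real checkpoints rather than on flat vectors, but the three steps above are the essence of the method.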
