louisbrulenaudet/Pearl-7B-slerp
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 5, 2024 · License: apache-2.0 · Architecture: Transformer

Pearl-7B-slerp is a 7.24 billion parameter language model developed by louisbrulenaudet, created by merging mlabonne/OmniBeagle-7B and WizardLM/WizardMath-7B-V1.1 using Spherical Linear Interpolation (SLERP). The model is optimized for mathematical tasks and demonstrates strong performance on the GSM8K benchmark. It also offers balanced performance across other benchmarks, making it suitable for applications requiring robust mathematical reasoning.
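To illustrate the merging technique named above, here is a minimal NumPy sketch of SLERP applied to weight tensors. This is an assumption-laden toy example (the function name, the per-tensor flattening, and the linear-interpolation fallback are illustrative choices, not the exact recipe used to build Pearl-7B-slerp; real merges are typically done per-layer with a tool such as mergekit):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t:  interpolation factor in [0, 1] (0 -> v0, 1 -> v1)
    v0, v1: arrays of identical shape (e.g. one layer's weights
            from each parent model).
    """
    v0_f, v1_f = v0.flatten(), v1.flatten()
    # Cosine of the angle between the flattened tensors
    cos_omega = np.dot(v0_f, v1_f) / (
        np.linalg.norm(v0_f) * np.linalg.norm(v1_f) + eps
    )
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    # Near-parallel tensors: fall back to plain linear interpolation
    if abs(np.sin(omega)) < eps:
        return (1.0 - t) * v0 + t * v1
    # Standard SLERP formula, interpolating along the great circle
    return (np.sin((1.0 - t) * omega) * v0 + np.sin(t * omega) * v1) / np.sin(omega)

# Toy example: interpolate halfway between two orthogonal "weight" tensors
a = np.array([[1.0, 0.0], [0.0, 1.0]])
b = np.array([[0.0, 1.0], [1.0, 0.0]])
mid = slerp(0.5, a, b)
```

Unlike plain averaging, SLERP follows the great-circle path between the two weight vectors, preserving their norm geometry, which is why it is a popular choice for model merging.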
