SyedAbdul/test-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Jan 2, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

SyedAbdul/test-7B-slerp is a 7-billion-parameter language model created by SyedAbdul, produced by merging OpenPipe/mistral-ft-optimized-1218 and cognitivecomputations/dolphin-2.6-mistral-7b-dpo with the SLERP (spherical linear interpolation) merge method. The merge combines the strengths of its two base models, and the resulting model is intended for general language tasks.
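As a rough illustration of what SLERP merging does, the sketch below interpolates two weight tensors along the arc between them rather than along a straight line. This is a minimal, hypothetical example, not the exact recipe used to build this model; the `slerp` helper and the interpolation factor `t` are assumptions for illustration only.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two same-shaped weight tensors."""
    v0_flat = v0.flatten().float()
    v1_flat = v1.flatten().float()
    # Normalize to compute the angle between the two parameter vectors.
    v0_n = v0_flat / (v0_flat.norm() + eps)
    v1_n = v1_flat / (v1_flat.norm() + eps)
    dot = torch.clamp(torch.dot(v0_n, v1_n), -1.0, 1.0)
    theta = torch.arccos(dot)
    if theta.abs() < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        merged = (1 - t) * v0_flat + t * v1_flat
    else:
        sin_theta = torch.sin(theta)
        coeff0 = torch.sin((1 - t) * theta) / sin_theta
        coeff1 = torch.sin(t * theta) / sin_theta
        merged = coeff0 * v0_flat + coeff1 * v1_flat
    return merged.reshape(v0.shape).to(v0.dtype)

# Example: blend a single layer's weights halfway between the two parents.
a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)
merged_layer = slerp(0.5, a, b)
```

In practice, merges like this are usually run per-tensor across both checkpoints, often with different `t` values for attention and MLP layers.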
