Weyaxi/openchat-3.5-1210-Seraph-Slerp
Task: Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Dec 27, 2023 · License: apache-2.0 · Architecture: Transformer · Open Weights
Weyaxi/openchat-3.5-1210-Seraph-Slerp is a 7-billion-parameter language model created by Weyaxi. It was produced by merging openchat/openchat-3.5-1210 and Weyaxi/Seraph-7B, both derived from Mistral-7B-v0.1, using the slerp (spherical linear interpolation) method. Rather than averaging parent weights along a straight line, slerp interpolates each pair of weight tensors along the arc between them, which aims to combine the strengths of both parents. The merged model is suited to general conversational AI tasks, and its 4096-token context window supports moderate-length interactions.
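As a rough illustration of what a slerp merge does per tensor, here is a minimal sketch of spherical linear interpolation in NumPy. This is not the actual merge pipeline used to build this model (tools such as mergekit apply this per-layer with configurable interpolation factors); the function name and the fallback threshold are illustrative choices.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values move
    along the arc between the two (flattened) tensors.
    """
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    # Angle between the two tensors, from their normalized dot product.
    dot = np.clip(
        np.dot(v0f / np.linalg.norm(v0f), v1f / np.linalg.norm(v1f)),
        -1.0, 1.0,
    )
    omega = np.arccos(dot)
    if omega < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return (1.0 - t) * v0 + t * v1
    s = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / s) * v0 + (np.sin(t * omega) / s) * v1
```

For example, interpolating halfway between two orthogonal unit vectors yields a unit vector bisecting them, whereas a plain average would shrink its norm; this norm-preserving behavior is a common motivation for choosing slerp over linear averaging when merging models.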