PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Dec 11, 2023 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights
PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp is a 7-billion-parameter language model created by PistachioAlt, built as a SLERP (spherical linear interpolation) merge of Q-bert/MetaMath-Cybertron-Starling and maywell/Synatra-7B-v0.3-RP. The merge combines the strengths of its two base models: mathematical reasoning on one side and roleplay optimization on the other. It is intended for tasks that require both logical processing and nuanced conversational interaction within its 4096-token context window.
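To illustrate what a SLERP merge does, the sketch below shows spherical linear interpolation applied to a pair of weight vectors. Merge tooling applies this per tensor across the two checkpoints; the pure-Python function here is an illustrative simplification, not the actual merge pipeline used for this model.

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two weight vectors at fraction t in [0, 1]."""
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    # Cosine of the angle between the two vectors, clamped for numerical safety.
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))
    # Nearly (anti-)parallel vectors: fall back to plain linear interpolation.
    if abs(dot) > 1.0 - eps:
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    theta = math.acos(dot)
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

Unlike plain averaging, SLERP follows the great-circle arc between the two weight directions, so the interpolated vector keeps a magnitude consistent with its endpoints; at t=0 and t=1 it reproduces the respective base model's weights exactly.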