maywell/Synatra-7B-v0.3-base
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Oct 28, 2023 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights

maywell/Synatra-7B-v0.3-base is a 7-billion-parameter causal language model developed by maywell, based on Mistral-7B-Instruct-v0.1. It is fine-tuned for role-playing (RP) and general knowledge, with improved common-sense understanding. The model is licensed for non-commercial use only (CC BY-NC 4.0) and supports both the ChatML and Alpaca (No-Input) instruction formats.
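Since the model accepts both formats, prompts must be assembled accordingly. The sketch below builds each prompt style using the widely used ChatML and Alpaca (No-Input) templates; these are assumptions based on the common conventions for those formats, so verify them against the model's tokenizer configuration before use.

```python
# Hypothetical prompt builders for the two instruction formats the model
# card lists. The exact special tokens / headers are assumed from the
# common ChatML and Alpaca conventions, not taken from this model's repo.

def chatml_prompt(system: str, user: str) -> str:
    """Build a ChatML-style prompt ending at the assistant turn."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

def alpaca_no_input_prompt(instruction: str) -> str:
    """Build an Alpaca (No-Input) prompt: instruction only, no input field."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

if __name__ == "__main__":
    print(chatml_prompt("You are a helpful assistant.", "Tell me a story."))
    print(alpaca_no_input_prompt("Tell me a story."))
```

Either string can then be tokenized and passed to the model; pick one format per request rather than mixing them.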
