maywell/Synatra-11B-Tb2M_SM
Text Generation · Concurrency Cost: 1 · Model Size: 10.7B · Quant: FP8 · Ctx Length: 4k · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights · Cold
Synatra-11B-Tb2M_SM is a 10.7 billion parameter language model developed by StableFluffy, built by merging Mistral-7B-Instruct-v0.1 and CollectiveCognition-v1.1-Mistral-7B. It is licensed under CC-BY-NC-4.0, which the developer applies in preference to Mistral's Apache 2.0 license, so any commercial use requires contacting the developer directly.
Synatra-11B-Tb2M_SM Overview
Synatra-11B-Tb2M_SM is a 10.7 billion parameter language model created by StableFluffy. It merges two base models, mistralai/Mistral-7B-Instruct-v0.1 and teknium/CollectiveCognition-v1.1-Mistral-7B, and was trained on four A100 80GB GPUs.
Key Characteristics
- Merged Architecture: Combines Mistral-7B-Instruct-v0.1 and CollectiveCognition-v1.1-Mistral-7B.
- Non-Commercial License: Released under CC-BY-NC-4.0, which the developer states takes precedence over Mistral's Apache 2.0 license. Commercial use requires contacting the developer directly.
Good for
- Non-commercial research and development: Ideal for personal projects, academic research, and experimentation where commercial application is not intended.
- Exploring merged Mistral-based architectures: Useful for developers interested in the performance characteristics of models built from specific Mistral-7B variants.
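When experimenting with the model, note that both parent models derive from Mistral-7B-Instruct-v0.1, so the merge most likely expects the Mistral `[INST] ... [/INST]` instruct template. This is an assumption, since the card does not state a prompt format; a minimal sketch:

```python
# Minimal sketch: building a prompt in the Mistral instruct style used by
# the Mistral-7B-Instruct-v0.1 base. Whether Synatra-11B-Tb2M_SM retains
# this exact template is an assumption; verify against the model card.

def build_prompt(user_message: str, system: str = "") -> str:
    """Wrap a user message (and optional system text) in [INST] tags."""
    inner = f"{system}\n{user_message}".strip() if system else user_message
    return f"<s>[INST] {inner} [/INST]"

if __name__ == "__main__":
    print(build_prompt("Summarize the CC-BY-NC-4.0 license in one sentence."))
```

The resulting string can be sent to whatever inference endpoint serves maywell/Synatra-11B-Tb2M_SM; the model's completion follows the closing `[/INST]` tag.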