WebraftAI/synapsellm-7b-mistral-v0.5-preview2
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 8k · Published: Dec 9, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights

WebraftAI's SynapseLLM is a 7-billion-parameter, decoder-only transformer model, fine-tuned from Mistral 7B v0.1. It is optimized for code generation and general question-answering, trained on a custom dataset that includes mathematical instructions, GPT-3.5 Q&A pairs, and a variety of code samples. The model aims to contribute to robust, generalized, and decentralized information systems, and is intended to be adaptable to specific domain applications.
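Since the weights are open, the model can be queried locally with the Hugging Face `transformers` library. The sketch below is illustrative: the Alpaca-style `### Instruction:` / `### Response:` prompt template is an assumption, not a format documented here, and the `generate` helper is hypothetical; consult the model card on Hugging Face for the recommended prompt format.

```python
MODEL_ID = "WebraftAI/synapsellm-7b-mistral-v0.5-preview2"


def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in an assumed Alpaca-style template.

    This template is a guess common to many instruction-tuned models;
    verify against the official model card before relying on it.
    """
    return (
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )


def generate(instruction: str, max_new_tokens: int = 256) -> str:
    """Load the model and produce a completion.

    Imports are done lazily so this module can be used without
    `transformers` installed. Loading a 7B model needs a sizable
    GPU (or patience on CPU).
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens so only the completion is returned.
    completion = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(completion, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Write a Python function that reverses a string."))
```

Swapping in a quantized checkpoint or serving stack (e.g. FP8 as listed above) follows the same pattern with a different loader.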
