WebraftAI/synapsellm-7b-mistral-v0.3-preview
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 8k · Published: Nov 29, 2023 · License: apache-2.0 · Architecture: Transformer · Open weights

WebraftAI/synapsellm-7b-mistral-v0.3-preview is a 7-billion-parameter decoder-only transformer, fine-tuned by WebraftAI from Mistral-7b-v0.1. The model is adapted for chat question answering and code-instruction tasks, trained on a custom dataset that includes general code, Python code, and varied Q/A scenarios. It is intended for robust, generalized conversational AI and code-related applications.
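Since the model derives from Mistral-7b-v0.1, prompts are typically wrapped in the Mistral instruction template. A minimal sketch, assuming this fine-tune follows the standard `[INST] ... [/INST]` format (check the tokenizer's chat template to confirm before relying on it):

```python
def format_mistral_prompt(user_message: str, system_prompt: str = "") -> str:
    """Wrap a user message in Mistral-style instruction tags.

    Assumes the standard Mistral template; this fine-tune may differ,
    so verify against the repository's tokenizer config.
    """
    if system_prompt:
        user_message = f"{system_prompt}\n\n{user_message}"
    return f"<s>[INST] {user_message} [/INST]"

prompt = format_mistral_prompt("Write a Python function that reverses a string.")
```

The resulting string can be passed directly to a completion endpoint or tokenized for local inference.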


Popular Sampler Settings

The top 3 parameter combinations used by Featherless users for this model draw on the following sampler settings:

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
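The sampler settings above map onto the request body of an OpenAI-compatible chat-completions call. A minimal sketch of such a payload, assuming the host accepts the common extensions `top_k`, `repetition_penalty`, and `min_p` alongside the standard OpenAI fields; the values shown are illustrative placeholders, not the actual popular configurations:

```python
# Hypothetical request body for an OpenAI-compatible endpoint.
# Values are example defaults, not measured user settings.
payload = {
    "model": "WebraftAI/synapsellm-7b-mistral-v0.3-preview",
    "messages": [
        {"role": "user", "content": "Explain list comprehensions in Python."}
    ],
    "temperature": 0.7,         # randomness of token sampling
    "top_p": 0.9,               # nucleus sampling cutoff
    "top_k": 40,                # restrict sampling to the k most likely tokens
    "frequency_penalty": 0.0,   # penalize tokens by how often they appeared
    "presence_penalty": 0.0,    # penalize tokens that appeared at all
    "repetition_penalty": 1.1,  # multiplicative penalty on repeated tokens
    "min_p": 0.05,              # drop tokens below this probability threshold
}
```

Fields the upstream OpenAI API does not define (`top_k`, `repetition_penalty`, `min_p`) are widely supported by open-model inference backends but should be confirmed against the provider's API documentation.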