kekmodel/StopCarbon-10.7B-v2

Text Generation · Concurrency Cost: 1 · Model Size: 10.7B · Quantization: FP8 · Context Length: 4k · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights

kekmodel/StopCarbon-10.7B-v2 is an experimental 10.7 billion parameter causal language model created by kekmodel using mergekit. It merges upstage/SOLAR-10.7B-Instruct-v1.0 and VAGOsolutions/SauerkrautLM-SOLAR-Instruct with the TIES merging method, and is designed for general instruction-following tasks, leveraging the combined strengths of its base models.


kekmodel/StopCarbon-10.7B-v2 Overview

StopCarbon-10.7B-v2 is an experimental 10.7 billion parameter language model developed by kekmodel. It was created using the mergekit tool, combining two distinct base models to potentially enhance performance and capabilities.

Key Characteristics

  • Architecture: Based on the SOLAR-10.7B architecture, inheriting its foundational strengths.
  • Merging Method: Utilizes the TIES merging method to integrate the weights of its constituent models (a simplified illustration of the idea follows this list).
  • Base Models: Merged from:
    • upstage/SOLAR-10.7B-Instruct-v1.0: A strong instruction-tuned model.
    • VAGOsolutions/SauerkrautLM-SOLAR-Instruct: a SOLAR-based instruction tune from VAGO solutions, likely contributing additional (notably German-language) instruction-following strengths.
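
The actual merge was produced with mergekit, whose internals are not shown on this page. As a rough, simplified sketch of what TIES merging does conceptually (trim each fine-tune's delta, elect a per-parameter sign, and average agreeing values), the following toy Python/PyTorch snippet operates on small random tensors; it is an illustration of the idea, not mergekit's implementation or the exact recipe used for this model:

import torch

def ties_merge(base, finetuned_list, density=0.5):
    # Toy TIES-style merge over flat parameter tensors (illustration only).
    # 1. Task vectors: difference between each fine-tune and the base.
    deltas = [ft - base for ft in finetuned_list]
    trimmed = []
    for d in deltas:
        # 2. Trim: keep only the top-`density` fraction of entries by magnitude.
        k = max(1, int(density * d.numel()))
        threshold = d.abs().flatten().topk(k).values.min()
        trimmed.append(torch.where(d.abs() >= threshold, d, torch.zeros_like(d)))
    stacked = torch.stack(trimmed)
    # 3. Elect a sign per parameter (here simply the sign of the summed deltas).
    elected_sign = torch.sign(stacked.sum(dim=0))
    # 4. Keep only entries agreeing with the elected sign, then average them.
    agree = (torch.sign(stacked) == elected_sign) & (stacked != 0)
    counts = agree.sum(dim=0).clamp(min=1)
    merged_delta = (stacked * agree).sum(dim=0) / counts
    return base + merged_delta

# Example on random toy tensors (not real model weights).
base = torch.randn(8)
merged = ties_merge(base, [base + torch.randn(8), base + torch.randn(8)])
print(merged)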

Prompt Template

This model expects prompts formatted with ### User: and ### Assistant: tags; following this template is important for reliable instruction following and response generation:

### User:
{user}

### Assistant:
{assistant}
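
As a minimal usage sketch, the snippet below loads the model with the Hugging Face transformers library and applies the template above. It assumes transformers, accelerate, and a GPU with enough memory are available; the example prompt and generation settings are illustrative, not prescribed by the model card:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kekmodel/StopCarbon-10.7B-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: half precision to fit a single GPU
    device_map="auto",          # requires the accelerate package
)

# Format the request using the ### User: / ### Assistant: template shown above.
prompt = "### User:\nSummarize the benefits of model merging in two sentences.\n\n### Assistant:\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))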

Potential Use Cases

Given its instruction-tuned base models, StopCarbon-10.7B-v2 is likely suitable for general-purpose instruction-following tasks such as question answering, text generation, summarization, and conversational AI, benefiting from the combined knowledge and instruction-following abilities of its merged components.