jeiku/SOLAR_Uncensored_LimaRP_10.7B

Text Generation · Concurrency Cost: 1 · Model Size: 10.7B · Quant: FP8 · Context Length: 4k · Published: Feb 18, 2024 · Architecture: Transformer

jeiku/SOLAR_Uncensored_LimaRP_10.7B is a 10.7 billion parameter language model, produced by a linear merge of w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored and jeiku/Re-Host_Limarp_Mistral. It is designed for general language generation with a focus on uncensored responses, and its 4096-token context length makes it suitable for applications requiring moderate context.


Model Overview

jeiku/SOLAR_Uncensored_LimaRP_10.7B is a 10.7 billion parameter language model, resulting from a merge of two distinct base models: w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored and jeiku/Re-Host_Limarp_Mistral. This model was constructed using the linear merge method via mergekit.
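
Linear merges with mergekit are typically driven by a small YAML config naming the source models and their weights. The sketch below writes such a config and invokes the `mergekit-yaml` CLI; the 0.5/0.5 weights and float16 dtype are illustrative assumptions, not the published recipe for this model.

```python
import pathlib
import subprocess

# Hypothetical mergekit config: the equal weights and float16 dtype
# are assumptions for illustration, not this model's actual recipe.
CONFIG = """\
merge_method: linear
dtype: float16
models:
  - model: w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored
    parameters:
      weight: 0.5
  - model: jeiku/Re-Host_Limarp_Mistral
    parameters:
      weight: 0.5
"""

pathlib.Path("linear_merge.yml").write_text(CONFIG)

# mergekit-yaml is mergekit's CLI entry point: config in, merged model out.
subprocess.run(
    ["mergekit-yaml", "linear_merge.yml", "./SOLAR_Uncensored_LimaRP_10.7B"],
    check=True,
)
```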

Key Characteristics

  • Parameter Count: 10.7 billion parameters.
  • Context Length: Supports a context window of 4096 tokens.
  • Merge Method: Utilizes a linear merge, combining the strengths of its constituent models (see the weighted-average sketch after this list).
  • Uncensored Nature: Inherits its uncensored behavior from the w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored base model, making it suitable for applications requiring less restrictive content generation.
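
For intuition, a linear merge is simply a parameter-wise weighted average of the source checkpoints. The sketch below is a minimal illustration, assuming two architecture-compatible PyTorch state dicts; the equal weights shown are an assumption, since the actual merge weights are not documented here.

```python
import torch

def linear_merge(state_dicts, weights):
    """Parameter-wise weighted average of architecture-compatible checkpoints."""
    assert len(state_dicts) == len(weights)
    merged = {}
    for name in state_dicts[0]:
        # Every checkpoint must carry the same tensor under the same name.
        merged[name] = sum(w * sd[name].float() for sd, w in zip(state_dicts, weights))
    return merged

# Hypothetical usage with equal weights for the two source checkpoints:
# merged = linear_merge([solar_uncensored_sd, limarp_sd], weights=[0.5, 0.5])
```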

Use Cases

This model is well-suited for general language generation tasks where a 10.7B parameter model with a 4096-token context is appropriate. Its uncensored nature may be beneficial for specific research or creative applications that require unfiltered outputs.
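
As a quick start, the model can be loaded with the Hugging Face transformers library like any other causal LM. The snippet below is a sketch: the sampling settings are illustrative defaults, and the plain-text prompt is an assumption, since no chat template is documented here.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jeiku/SOLAR_Uncensored_LimaRP_10.7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype="auto"
)

prompt = "Write a short scene set in a rain-soaked city at night."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Keep prompt plus generation within the 4096-token context window.
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```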