MrRobotoAI/8b-unaligned-BASE-v2c

Text generation | Concurrency cost: 1 | Model size: 8B | Quant: FP8 | Context length: 8k | Architecture: Transformer

MrRobotoAI/8b-unaligned-BASE-v2c is an 8 billion parameter language model created by MrRobotoAI using the Model Stock merge method. It merges MrRobotoAI/Thor-v1.4-8b-DARK-FICTION (the base) with multiple LoRA adapters, including ones from kromcomp and ResplendentAI, aiming to combine the strengths of its constituent models in a single checkpoint.


Model Overview

This model is a merge of several pre-trained checkpoints and LoRA adapters, combined with the Model Stock merge method. The base model for the merge is MrRobotoAI/Thor-v1.4-8b-DARK-FICTION.
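To make the merge method concrete, below is a toy sketch of the core interpolation rule from the Model Stock paper: the fine-tuned weights are averaged, then interpolated back toward the pretrained anchor with a ratio derived from the angle between the fine-tuned "task vectors". This is an illustrative simplification on flat vectors; the actual merge operates per-layer on full model tensors and is handled by merge tooling, not this snippet.

```python
import numpy as np

def model_stock_merge(w0, finetuned):
    """Toy Model Stock merge of k fine-tuned weight vectors.

    w0        -- pretrained (anchor) weights as a 1-D array
    finetuned -- list of k fine-tuned weight vectors of the same shape
    """
    k = len(finetuned)
    # Task vectors: how each fine-tune moved away from the anchor.
    deltas = [w - w0 for w in finetuned]

    # Average pairwise cosine similarity between task vectors.
    cos_vals = []
    for i in range(k):
        for j in range(i + 1, k):
            cos_vals.append(
                np.dot(deltas[i], deltas[j])
                / (np.linalg.norm(deltas[i]) * np.linalg.norm(deltas[j]))
            )
    cos_theta = float(np.mean(cos_vals))

    # Interpolation ratio from the Model Stock paper:
    # t = k*cos(theta) / (1 + (k-1)*cos(theta))
    t = k * cos_theta / (1 + (k - 1) * cos_theta)

    # Pull the average of the fine-tunes back toward the anchor.
    w_avg = np.mean(finetuned, axis=0)
    return t * w_avg + (1 - t) * w0
```

Intuitively, when the fine-tunes agree (small angle, cosine near 1), the merge trusts their average; when they point in unrelated directions, it stays close to the anchor.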

Key Characteristics

This model integrates capabilities from a diverse set of LoRA adapters, all applied to a base of MrRobotoAI/8b-unaligned-BASE-v2b. The merged components include:

  • kromcomp LoRAs: L3-Templar-r128-LoRA, L3.1-Baldur-r64-LoRA, L3-FantasyWriter-r64-LoRA, L3-BlueSerp-LoRA, L3-Smaug-r64-LoRA, L3.1-Cakrawala-r128-LoRA, L3.1-Mistral-Data-r128-LoRA, and others.
  • ResplendentAI LoRA: Includes Llama3_Aesir_Preview_LoRA_128.
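A merge like the one above is typically expressed as a mergekit configuration. The exact config for this model is not shown on the card, so the following is a hypothetical sketch only: the repository paths with LoRAs attached via mergekit's `base+lora` syntax are assumptions, and the real merge includes many more adapters than the two listed here.

```yaml
# Hypothetical mergekit config sketch -- not the actual recipe for this model.
merge_method: model_stock
base_model: MrRobotoAI/Thor-v1.4-8b-DARK-FICTION
models:
  - model: MrRobotoAI/8b-unaligned-BASE-v2b+kromcomp/L3-Templar-r128-LoRA
  - model: MrRobotoAI/8b-unaligned-BASE-v2b+ResplendentAI/Llama3_Aesir_Preview_LoRA_128
dtype: bfloat16
```

Each entry in `models` is one fine-tuned variant (here, the v2b base with a LoRA applied), and `base_model` serves as the Model Stock anchor.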

Use Cases

Given its composition, this model is most plausibly suited to creative writing and long-form fiction: its base is a dark-fiction model, and the merged adapters include fantasy- and roleplay-oriented LoRAs such as L3-FantasyWriter-r64-LoRA. The breadth of merged components may also lend it general-purpose versatility, making it a reasonable choice for developers who want several fine-tuned strengths in a single checkpoint.