EldritchLabs/Altair-Stock-12B-v1

Text generation · Model size: 12B · Quantization: FP8 · Context length: 32k · Published: Mar 5, 2026 · Architecture: Transformer · Concurrency cost: 1

EldritchLabs/Altair-Stock-12B-v1 is a 12-billion-parameter language model created by EldritchLabs, merged using the Model Stock method. It is based on mistralai/Mistral-Nemo-Instruct-2407 and integrates capabilities from ten other specialized models. The merge aims to combine diverse strengths, making the model suitable for a broad range of generative AI applications, particularly those that benefit from a blend of instruction-following and role-play-oriented fine-tunes.


Altair Stock 12B v1 Overview

EldritchLabs/Altair-Stock-12B-v1 was created with the Model Stock merge method, which combines multiple fine-tuned checkpoints of the same base architecture into a single, more versatile model. The base model for this merge is mistralai/Mistral-Nemo-Instruct-2407.
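To make the merge method concrete, here is a minimal sketch of the Model Stock interpolation for a single weight tensor, following the formula from the Model Stock paper (Jang et al., 2024). This is an illustrative simplification, not the actual merge recipe used for this model; the function name and NumPy representation are my own, and real merges operate per layer over full checkpoints (typically via a tool such as mergekit).

```python
import numpy as np

def model_stock_merge(base, finetuned):
    """Sketch of Model Stock interpolation for one weight tensor.

    base: weight tensor from the base model.
    finetuned: list of k >= 2 fine-tuned tensors of the same shape.
    Returns the merged tensor.
    """
    k = len(finetuned)
    # Task vectors: each fine-tune's delta from the base weights.
    deltas = [w - base for w in finetuned]

    def cos(a, b):
        return float(a.ravel() @ b.ravel()
                     / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Average pairwise cosine similarity between task vectors.
    pairs = [(i, j) for i in range(k) for j in range(i + 1, k)]
    cos_theta = sum(cos(deltas[i], deltas[j]) for i, j in pairs) / len(pairs)

    # Interpolation ratio from the paper: t = k*cos(th) / (1 + (k-1)*cos(th)).
    # Nearly parallel task vectors (cos ~ 1) push t toward 1 (trust the
    # fine-tune average); near-orthogonal ones pull the merge back to base.
    t = k * cos_theta / (1 + (k - 1) * cos_theta)
    w_avg = sum(finetuned) / k
    return t * w_avg + (1 - t) * base
```

The appeal of the method is that the mixing ratio is computed from the geometry of the fine-tunes themselves rather than hand-tuned, which is why it scales naturally to a merge of ten-plus components like this one.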

Key Characteristics

This model integrates the strengths of ten distinct models, including:

  • Instruction-following: Leveraging the base Mistral-Nemo-Instruct model.
  • Specialized fine-tunes: Incorporating models such as anthracite-org/magnum-v4-12b, ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.2, and SuperbEmphasis/MN-12b-RP-Ink-RP-Longform, which suggest a focus on role-playing, creative generation, and nuanced conversational ability.

Use Cases

Given its diverse merge components, Altair Stock 12B v1 is well-suited for applications requiring a blend of capabilities, such as:

  • Creative writing and storytelling: Benefiting from models fine-tuned for role-play and long-form generation.
  • Instruction-based tasks: General-purpose instruction following inherited from its base model.
  • Complex conversational agents: Combining various specialized models for more human-like and engaging interactions.
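For the conversational use cases above, prompts presumably follow the Mistral instruct format inherited from Mistral-Nemo-Instruct-2407. The sketch below builds such a prompt string by hand to show the shape of the template; the exact whitespace and special tokens are an assumption on my part, so in practice you should call `tokenizer.apply_chat_template`, which reads the authoritative template shipped with the model.

```python
def build_prompt(turns, user_message):
    """Approximate a Mistral-style instruct prompt.

    turns: list of (user, assistant) pairs from earlier in the chat.
    user_message: the new user request.
    NOTE: template details are assumed from the Mistral-Nemo lineage;
    prefer tokenizer.apply_chat_template for the real format.
    """
    prompt = "<s>"
    for user, assistant in turns:
        prompt += f"[INST] {user} [/INST] {assistant}</s>"
    prompt += f"[INST] {user_message} [/INST]"
    return prompt

# Single-turn example:
# build_prompt([], "Write a short story about a lighthouse.")
# -> "<s>[INST] Write a short story about a lighthouse. [/INST]"
```

The model then generates the assistant reply after the final `[/INST]` marker; multi-turn history is carried by appending each completed exchange before the new request.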