nlpguy/AlloyIngotNeoY

Text Generation · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Mar 8, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

nlpguy/AlloyIngotNeoY is a 7 billion parameter language model created by nlpguy, merged using the task_swapping_ties method with ammarali32/multi_verse_model as its base. This model integrates capabilities from yam-peleg/Experiment26-7B, aiming to combine their strengths. It is designed for general language understanding and generation tasks, leveraging its merged architecture for potentially enhanced performance.


Model Overview

nlpguy/AlloyIngotNeoY is a 7 billion parameter language model developed by nlpguy, constructed through a merge of existing pre-trained models. This model utilizes the task_swapping_ties merge method, a technique designed to combine the strengths of multiple models into a single, more capable entity.
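At its simplest, a weight-based merge interpolates each parameter tensor of the component models. The sketch below shows that core idea with NumPy; it is an illustrative stand-in only, since the actual task_swapping_ties method also involves sign-consensus (TIES-style) and offset logic not reproduced here.

```python
import numpy as np

def weighted_merge(base_params, donor_params, base_w=0.6, donor_w=0.4):
    """Per-tensor weighted interpolation of two models' parameters.

    Simplified illustration of a model merge; the real task_swapping_ties
    method applies additional trimming/sign-election steps on top of
    weighting like this.
    """
    merged = {}
    for name, base_t in base_params.items():
        merged[name] = base_w * base_t + donor_w * donor_params[name]
    return merged

# Toy example with two small "layers" of random weights.
rng = np.random.default_rng(0)
base = {f"layer{i}.weight": rng.normal(size=(4, 4)) for i in range(2)}
donor = {f"layer{i}.weight": rng.normal(size=(4, 4)) for i in range(2)}
merged = weighted_merge(base, donor)
```

The 0.6/0.4 split here mirrors the weights reported for this merge, with the heavier weight on the base model.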

Merge Details

The merge used ammarali32/multi_verse_model as the base model and yam-peleg/Experiment26-7B as the donor model. The configuration applied a diagonal_offset of 2.0 and weighted yam-peleg/Experiment26-7B at 0.4 and ammarali32/multi_verse_model at 0.6 across all 32 layers.
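The details above can be expressed as a mergekit-style configuration. Note this is a reconstruction for illustration: the field names follow common mergekit conventions, and the exact schema used by the task_swapping_ties method may differ from what is shown.

```yaml
# Illustrative reconstruction of the merge configuration
# (field layout follows mergekit conventions; not the original file)
merge_method: task_swapping_ties
base_model: ammarali32/multi_verse_model
parameters:
  diagonal_offset: 2.0
models:
  - model: ammarali32/multi_verse_model
    parameters:
      weight: 0.6
  - model: yam-peleg/Experiment26-7B
    parameters:
      weight: 0.4
```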

Potential Use Cases

Given its merged nature, AlloyIngotNeoY is likely suitable for a range of general-purpose natural language processing tasks, including:

  • Text generation
  • Summarization
  • Question answering
  • Conversational AI
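Assuming the weights are published on the Hugging Face Hub under the same identifier, a minimal text-generation sketch with the transformers library might look like the following. The imports are deferred inside the function so the snippet can be defined without transformers or torch installed; actually calling it downloads and loads a 7B model, which needs a GPU or substantial RAM.

```python
def generate(prompt: str,
             model_id: str = "nlpguy/AlloyIngotNeoY",
             max_new_tokens: int = 128) -> str:
    """Generate a completion with the merged model via transformers.

    Deferred imports keep this sketch importable without the heavy
    dependencies; device_map="auto" additionally requires accelerate.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,
    )
    # Decode only the newly generated tokens, not the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

For conversational use, the prompt would typically be formatted with the tokenizer's chat template before calling `generate`.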