daisd-ai/ner-on-merged
Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Context Length: 32k · Published: Dec 15, 2025 · Architecture: Transformer

The daisd-ai/ner-on-merged model is a 4 billion parameter language model created by daisd-ai using the Linear merge method via mergekit. It features a context length of 32768 tokens. This model is a merge of pre-trained language models, designed for general language understanding tasks.


Model Overview

The daisd-ai/ner-on-merged model is a 4 billion parameter language model developed by daisd-ai. It was constructed with the mergekit tool using the Linear merge method, which combines the weights of several pre-trained language models. This approach aims to carry the strengths of multiple base models into a single, cohesive model.
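The exact merge recipe for this model is not published. For illustration, a mergekit configuration for a linear merge generally takes the following shape; the model names and weights below are placeholders, not the actual sources of ner-on-merged:

```yaml
merge_method: linear
models:
  - model: example-org/base-model-a   # placeholder base model
    parameters:
      weight: 0.5
  - model: example-org/base-model-b   # placeholder base model
    parameters:
      weight: 0.5
dtype: bfloat16
```

With mergekit installed, a config like this is typically run via its `mergekit-yaml` entry point to write the merged checkpoint to an output directory.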

Key Characteristics

  • Parameter Count: 4 billion parameters, offering a balance between performance and computational efficiency.
  • Context Length: Supports a substantial context window of 32768 tokens, enabling processing of longer texts and maintaining coherence over extended conversations or documents.
  • Merge Method: Utilizes the Linear merge method, which builds the new model by taking a weighted average of the corresponding parameters of the source models.
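Concretely, a linear merge computes each output parameter as a weighted average of that parameter across the source models. A minimal sketch in plain Python, using toy state dicts with list-valued "tensors" (real merges operate on full model checkpoints):

```python
def linear_merge(state_dicts, weights):
    """Linearly merge model state dicts: out[k] = sum_i w_i * sd_i[k].

    Weights are normalized to sum to 1, so equal weights give
    an element-wise mean of the source parameters.
    """
    total = sum(weights)
    norm = [w / total for w in weights]
    merged = {}
    for key in state_dicts[0]:
        merged[key] = [
            sum(w * sd[key][i] for w, sd in zip(norm, state_dicts))
            for i in range(len(state_dicts[0][key]))
        ]
    return merged

# Two toy "models", each with a single parameter tensor.
model_a = {"layer.weight": [1.0, 2.0, 3.0]}
model_b = {"layer.weight": [3.0, 4.0, 5.0]}
merged = linear_merge([model_a, model_b], weights=[0.5, 0.5])
print(merged["layer.weight"])  # equal weights -> element-wise mean
```

The averaging only makes sense when the source models share an architecture and parameter names, which is why merges combine fine-tunes of a common base rather than arbitrary models.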

Potential Use Cases

This model is suitable for a variety of natural language processing tasks where a merged model's combined knowledge can be beneficial. Its large context window makes it particularly useful for applications requiring deep understanding of lengthy inputs.