johnsnowlabs/BioLing-7B-Dare

Hugging Face

Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · License: apache-2.0 · Architecture: Transformer

BioLing-7B-Dare is a 7 billion parameter language model developed by John Snow Labs, created by merging BioMistral/BioMistral-7B and Nexusflow/Starling-LM-7B-beta using the DARE TIES method. This model leverages a blend of specialized and general-purpose LLMs, featuring a context length of 8192 tokens. It is designed to combine the strengths of its base models, making it suitable for diverse natural language processing tasks.


BioLing-7B-Dare: A Merged Language Model

BioLing-7B-Dare is a 7 billion parameter language model developed by John Snow Labs. It was built with the DARE TIES merging method, which combines two distinct base models: BioMistral/BioMistral-7B and Nexusflow/Starling-LM-7B-beta. The merge configuration specifies a density of 0.53 and weights of 0.4 (BioMistral) and 0.3 (Starling-LM), weighting the specialized biomedical model slightly more heavily than the general-purpose one.

Key Characteristics

  • Architecture: Merged model based on BioMistral/BioMistral-7B and Nexusflow/Starling-LM-7B-beta.
  • Parameter Count: 7 billion parameters.
  • Context Length: Supports an 8192-token context window.
  • Merging Method: Utilizes the dare_ties method for model combination, with int8_mask enabled and bfloat16 dtype.
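
The parameters above can be expressed as a mergekit-style YAML configuration. This is a reconstruction from the values stated on this card, not the authors' published config; in particular, `base_model` is an assumption (DARE TIES requires a common base, and both parents are Mistral-7B derivatives):

```yaml
models:
  - model: BioMistral/BioMistral-7B
    parameters:
      density: 0.53
      weight: 0.4
  - model: Nexusflow/Starling-LM-7B-beta
    parameters:
      density: 0.53
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1  # assumed common base, not stated on the card
parameters:
  int8_mask: true
dtype: bfloat16
```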

Usage and Licensing

The model is available under a CC-BY-NC-ND license and adheres to John Snow Labs' Acceptable Use Policy; commercial use requires a separate license. It can be integrated with the Hugging Face transformers library for text generation tasks.
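
A minimal inference sketch with transformers is shown below. The repo ID is real; the prompt format in `build_prompt` is an assumption, since this card does not document a chat template:

```python
# Minimal local-inference sketch for BioLing-7B-Dare (assumes
# transformers and torch are installed: pip install transformers torch).
MODEL_ID = "johnsnowlabs/BioLing-7B-Dare"

def build_prompt(question: str) -> str:
    # Assumed plain instruction format; the parent models may respond
    # better to their own chat templates.
    return f"Question: {question}\nAnswer:"

def generate(question: str, max_new_tokens: int = 256) -> str:
    # Heavy imports are kept inside the function so the module is cheap
    # to import when only build_prompt is needed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=True)
    # Strip the prompt tokens and decode only the completion.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

# Usage (downloads ~14 GB of weights; needs a GPU or ample RAM):
# print(generate("What are the first-line treatments for hypertension?"))
```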

Evaluation

Evaluation results for BioLing-7B-Dare are currently pending and will be released soon.

Popular Sampler Settings

The three most popular parameter combinations used by Featherless users for this model tune the following sampler parameters: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
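
Most of these parameters map directly onto keyword arguments of transformers' `generate()`. The values below are illustrative placeholders, not the actual user configurations (which this card does not list):

```python
# Illustrative sampler settings (placeholder values, not measured configs).
sampler_config = {
    "do_sample": True,
    "temperature": 0.7,         # softmax temperature; lower = more deterministic
    "top_p": 0.9,               # nucleus sampling: keep smallest set with cumulative prob >= 0.9
    "top_k": 40,                # keep only the 40 most likely tokens
    "repetition_penalty": 1.1,  # values > 1 discourage repeating earlier tokens
    "min_p": 0.05,              # drop tokens below 5% of the top token's probability
}

# frequency_penalty and presence_penalty are OpenAI-style API parameters;
# when calling an OpenAI-compatible endpoint they go in the request body
# rather than into generate().

# Usage (model and inputs prepared elsewhere):
# outputs = model.generate(**inputs, **sampler_config)
```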