Biomimicry-AI/ANIMA-Nectar-v2

Task: Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 8k · Published: Dec 4, 2023 · License: MIT · Architecture: Transformer · Open weights · Serving tier: Cold

Biomimicry-AI/ANIMA-Nectar-v2 is a 7 billion parameter language model developed by Biomimicry-AI with an 8192 token context length. The model's specific architecture and training details are not provided in the available documentation, and its primary differentiators and intended use cases remain unspecified: the model card marks key sections as "More Information Needed."


Model Overview

Biomimicry-AI/ANIMA-Nectar-v2 is a 7 billion parameter language model with an 8192 token context length. The model's developer is Biomimicry-AI. The provided model card is a template, indicating that detailed information regarding its architecture, training data, specific capabilities, and evaluation results is currently pending or not publicly disclosed.

Key Characteristics

  • Parameters: 7 billion
  • Context Length: 8192 tokens
  • Developer: Biomimicry-AI

Current Status and Limitations

As per the model card, significant details are marked as "More Information Needed." This includes:

  • Model type and underlying architecture
  • Language(s) supported
  • License information
  • Specific use cases (direct or downstream)
  • Bias, risks, and limitations
  • Training data and procedure details
  • Evaluation metrics and results

Users are advised that, absent further documentation, the model's intended applications, performance characteristics, and potential biases remain undefined; these gaps should be weighed before deploying the model.

Popular Sampler Settings

The three most common parameter combinations used by Featherless users for this model cover the following sampler settings (the specific values are shown in the interactive configuration tabs and are not reproduced here):

  • temperature
  • top_p
  • top_k
  • frequency_penalty
  • presence_penalty
  • repetition_penalty
  • min_p
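As a rough sketch of how these sampler settings are typically applied: assuming Featherless exposes an OpenAI-compatible chat-completions endpoint (an assumption, not confirmed by this page), the parameters above would travel as top-level fields of the request body. The `build_request` helper and the sampler values below are illustrative, not the actual "popular" configurations from the tabs.

```python
import json

# Model identifier from this page; the endpoint shape is an assumption
# based on common OpenAI-compatible serving APIs.
MODEL_ID = "Biomimicry-AI/ANIMA-Nectar-v2"

# Sampler fields listed in the "Popular Sampler Settings" section.
SAMPLER_KEYS = {
    "temperature", "top_p", "top_k",
    "frequency_penalty", "presence_penalty",
    "repetition_penalty", "min_p",
}

def build_request(prompt: str, **samplers) -> dict:
    """Assemble a chat-completions payload with optional sampler overrides.

    Unrecognised keyword arguments are dropped rather than forwarded,
    so typos do not silently reach the API.
    """
    payload = {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
    }
    payload.update({k: v for k, v in samplers.items() if k in SAMPLER_KEYS})
    return payload

# Illustrative values only -- not taken from the site's popular configs.
req = build_request("Hello", temperature=0.8, top_p=0.95, min_p=0.05)
print(json.dumps(req, indent=2))
```

The payload would then be POSTed to the provider's completions endpoint with an API key; only the keys in `SAMPLER_KEYS` are forwarded, so unsupported options are filtered out client-side.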