Undi95/SynthiAthena-v2
Text generation · Concurrency cost: 1 · Model size: 13B · Quant: FP8 · Context length: 4k · License: cc-by-nc-4.0 · Architecture: Transformer · Open weights · Cold

Undi95/SynthiAthena-v2 is a 13 billion parameter language model created by Undi95, resulting from a 50/50 merge of migtissera/Synthia-13B and IkariDev/Athena-v2. The model is designed for general language generation tasks, leveraging the combined strengths of its merged predecessors. With a 4096-token context length, it offers balanced performance for a range of conversational and creative applications.


Undi95/SynthiAthena-v2: A Merged 13B Language Model

Undi95/SynthiAthena-v2 is a 13 billion parameter language model developed by Undi95, specifically created for DarkReaperBoy. This model is a direct 50/50 merge of two distinct base models: migtissera/Synthia-13B and IkariDev/Athena-v2. The merging strategy aims to combine the strengths and characteristics of both foundational models into a single, cohesive unit.
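The card does not publish the exact merge recipe, but a 50/50 merge of two checkpoints sharing the same architecture is typically a linear average of their parameters. A minimal sketch of that idea, with plain Python dicts and scalar values standing in for the two models' state dicts (the parameter names and values below are illustrative, not taken from the actual checkpoints):

```python
def merge_50_50(state_a: dict, state_b: dict) -> dict:
    """Average two state dicts with matching keys (a 50/50 linear merge)."""
    assert state_a.keys() == state_b.keys(), "checkpoints must share parameter names"
    return {name: 0.5 * state_a[name] + 0.5 * state_b[name] for name in state_a}

# Tiny stand-ins for Synthia-13B and Athena-v2 parameters (illustrative values).
synthia = {"layer.0.weight": 1.0, "layer.0.bias": -2.0}
athena  = {"layer.0.weight": 3.0, "layer.0.bias":  4.0}

merged = merge_50_50(synthia, athena)
# merged["layer.0.weight"] == 2.0, merged["layer.0.bias"] == 1.0
```

In a real merge the same averaging is applied tensor-by-tensor across every weight in both models, which is why the two parents must have identical architectures and parameter shapes.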

Key Capabilities

  • General-purpose language generation: Designed to handle a broad spectrum of text-based tasks.
  • Merged model strengths: Benefits from the combined training and capabilities of Synthia-13B and Athena-v2.
  • 4096-token context window: Provides sufficient context for moderately complex interactions and text generation.
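Note that the 4096-token window bounds the prompt and the generated reply together, so longer prompts leave less room for generation. A small sketch of that budgeting (the token counts in the example are hypothetical):

```python
CTX_LENGTH = 4096  # SynthiAthena-v2's context window

def max_new_tokens(prompt_tokens: int, ctx_length: int = CTX_LENGTH) -> int:
    """Tokens left for generation after the prompt; 0 if the prompt fills the window."""
    return max(ctx_length - prompt_tokens, 0)

# e.g. a 3000-token prompt leaves 1096 tokens for the reply
budget = max_new_tokens(3000)
```

Passing a value like this as the generation limit (e.g. `max_new_tokens` in common inference APIs) avoids requests that overflow the context window.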

Good For

  • Exploratory language model applications: Ideal for users looking to experiment with a merged model's unique characteristics.
  • Conversational AI: Suitable for developing chatbots or interactive agents that require coherent and context-aware responses.
  • Creative writing and content generation: Can assist in generating diverse textual content based on prompts.