feelinrealcute/pym-13b7

TEXT GENERATION · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Architecture: Transformer · Cold

feelinrealcute/pym-13b7 is a 13-billion-parameter language model, duplicated from TehVenom/Pygmalion-13b-Merged, with a context length of 4096 tokens. It is provided specifically for use on Google Colab, which makes it easy to deploy in cloud-based environments without local setup. Its primary utility lies in applications suited to the Pygmalion model family, which is commonly used for conversational AI and character-driven interactions.


Model Overview

feelinrealcute/pym-13b7 is a 13-billion-parameter language model, directly duplicated from the TehVenom/Pygmalion-13b-Merged project. It features a context window of 4096 tokens, enough for moderately long conversational histories or text generation tasks.

Key Characteristics

  • Parameter Count: 13 billion parameters, offering a balance between performance and computational requirements.
  • Context Length: 4096 tokens, suitable for maintaining coherence over extended interactions.
  • Origin: A direct copy of TehVenom/Pygmalion-13b-Merged, indicating its lineage and intended capabilities are aligned with the Pygmalion model family.

Intended Use

  • Google Colab Deployment: This model is specifically provided for use on Google Colab, making it accessible for researchers and developers leveraging cloud-based GPU resources without complex local setups.
  • Pygmalion-aligned Applications: Given its origin, it is likely optimized for tasks commonly associated with Pygmalion models, such as character AI, role-playing, and interactive storytelling.
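As a sketch of how this might look in a Colab notebook, the snippet below loads the model with the Hugging Face `transformers` library and trims a prompt to fit the 4096-token context window. It assumes the model is available on the Hugging Face Hub under the ID `feelinrealcute/pym-13b7`; the fp16 dtype, `device_map="auto"` setting, and the 512-token generation reserve are illustrative choices, not requirements of the model.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

MODEL_ID = "feelinrealcute/pym-13b7"  # Hub ID assumed from this card
CTX_LEN = 4096                        # context window stated on the card


def load_model():
    """Load tokenizer and model; fp16 with device_map='auto' is a common
    fit for a single Colab GPU (exact memory needs depend on the runtime)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float16,
        device_map="auto",
    )
    return tokenizer, model


def clamp_to_context(token_ids, ctx_len=CTX_LEN, max_new_tokens=512):
    """Drop the oldest tokens so prompt + generated tokens fit the window.

    Keeping the most recent tokens preserves the latest conversational
    turns, which matters for character-driven chat use cases.
    """
    budget = ctx_len - max_new_tokens
    return token_ids[-budget:]
```

A typical generation loop would tokenize the running chat history, pass it through `clamp_to_context`, and call `model.generate(..., max_new_tokens=512)` so the reply never overruns the 4096-token limit.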