TheDrummer/Gemmasutra-Mini-2B-v1

Text Generation · Model Size: 2.6B · Quant: BF16 · Ctx Length: 8k · Published: Aug 3, 2024 · License: other · Architecture: Transformer · Concurrency Cost: 1

Gemmasutra-Mini-2B-v1 is a 2.6 billion parameter language model developed by BeaverAI, fine-tuned for immersive role-playing experiences. This model is designed to deliver satisfying performance even on resource-constrained devices like mobile phones and Raspberry Pi, supporting an 8192-token context length. It is notably uncensored and unaligned, making it suitable for diverse creative and unrestricted narrative generation, though it is not recommended for mathematical tasks.


Gemmasutra Mini 2B v1: A Compact Role-Playing Powerhouse

Gemmasutra Mini 2B v1, developed by BeaverAI, is a 2.6 billion parameter model specifically fine-tuned for exceptional role-playing (RP) experiences. It challenges the notion that smaller models cannot provide engaging RP, aiming to deliver a satisfying experience across a wide range of devices.

Key Capabilities

  • Optimized for Role-Playing: Delivers high-quality, immersive role-playing narratives.
  • Resource-Efficient: Designed to run effectively on low-power hardware, including mobile phones (via Layla) and single-board computers like Raspberry Pi, as well as in-browser.
  • Uncensored and Unaligned: Provides unrestricted content generation for diverse creative applications.
  • Extended Context: Supports an 8192-token context length, allowing for longer and more complex interactions.
  • Flexible Usage: Works well with the Gemma Instruct template (modified to carry a system role, which the stock Gemma template lacks) and with standard Chat Completion.
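As a rough illustration of the template point above, the sketch below renders a chat history into a Gemma Instruct-style prompt. It assumes the standard Gemma turn markers (`<start_of_turn>` / `<end_of_turn>`); since that template has no native system role, the system text is folded into the first user turn, which is one common form of the "system role modification" the card mentions. Verify the exact template against the model's tokenizer config before relying on it.

```python
def build_gemma_prompt(messages):
    """Render a list of {role, content} dicts into a Gemma Instruct-style prompt.

    Sketch only: assumes the standard Gemma turn markers and merges any
    system message into the first user turn, since the stock template
    has no system role.
    """
    parts = []
    system_text = ""
    for msg in messages:
        role, content = msg["role"], msg["content"]
        if role == "system":
            system_text = content  # held back, merged into the next user turn
        elif role == "user":
            text = f"{system_text}\n\n{content}" if system_text else content
            system_text = ""
            parts.append(f"<start_of_turn>user\n{text}<end_of_turn>\n")
        elif role == "assistant":
            parts.append(f"<start_of_turn>model\n{content}<end_of_turn>\n")
    parts.append("<start_of_turn>model\n")  # cue the model to respond
    return "".join(parts)

prompt = build_gemma_prompt([
    {"role": "system", "content": "You are a daring storyteller."},
    {"role": "user", "content": "Begin the tale."},
])
```

When using Chat Completion through a front end such as SillyTavern or an OpenAI-compatible server, this templating is handled for you; the manual form is mainly useful for raw text-generation endpoints.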

Good For

  • Users seeking a powerful yet lightweight model for local role-playing on various devices.
  • Creative writing and narrative generation requiring an uncensored model.
  • Developers targeting mobile or embedded systems for LLM applications.

Note: This model is not recommended for mathematical tasks.