lemon07r/RiverCub-Gemma-3-27B

Vision · 27B · FP8 · 32768 context · License: gemma

Overview
RiverCub-Gemma-3-27B is a 27 billion parameter language model developed by lemon07r. It is a SLERP (Spherical Linear Interpolation) merge of two prominent Gemma-3 27B models: unsloth/gemma-3-27b-it and TheDrummer/Big-Tiger-Gemma-27B-v3. The creator's intent was to leverage the strengths of both base models, specifically aiming to preserve the quality of Google's official instruct-trained Gemma while integrating the benefits of other fine-tuned versions.
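SLERP merges of this kind are typically produced with the mergekit tool. The card does not publish the actual merge configuration, so the following is only a hedged sketch of what a SLERP merge between these two models might look like; the interpolation factor `t`, the choice of base model, and the dtype are illustrative assumptions, not the creator's settings:

```yaml
# Hypothetical mergekit config -- illustrative only; the actual
# parameters used for RiverCub-Gemma-3-27B are not published here.
merge_method: slerp
base_model: unsloth/gemma-3-27b-it
models:
  - model: unsloth/gemma-3-27b-it
  - model: TheDrummer/Big-Tiger-Gemma-27B-v3
parameters:
  t: 0.5        # assumed blend: 0 = pure base model, 1 = pure second model
dtype: bfloat16  # assumed working precision for the merge
```

With a config like this, mergekit interpolates each pair of corresponding weight tensors along the sphere between the two checkpoints rather than averaging them linearly.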

Key Characteristics

  • Architecture: Based on Gemma-3, Google's open-weights model family.
  • Parameter Count: 27 billion parameters, offering a balance of performance and resource requirements.
  • Merge Method: Utilizes SLERP, which interpolates weights along the arc between the two models rather than averaging them linearly, preserving the geometry of each checkpoint's weights.
  • Base Models: Merges unsloth/gemma-3-27b-it and TheDrummer/Big-Tiger-Gemma-27B-v3, which the creator reports selecting after testing several candidate models.
  • Context Length: Supports a context length of 32768 tokens.
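The SLERP operation referenced above can be sketched in a few lines. This is a minimal illustration on plain Python lists, not the actual merge pipeline, which applies the same interpolation tensor-by-tensor across two full model checkpoints:

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values move along
    the arc between the (direction of the) two vectors.
    """
    # Angle between the two vectors, from their normalized dot product.
    n0 = math.sqrt(sum(x * x for x in v0))
    n1 = math.sqrt(sum(x * x for x in v1))
    dot = sum(a * b for a, b in zip(v0, v1)) / (n0 * n1)
    dot = max(-1.0, min(1.0, dot))  # guard against rounding outside [-1, 1]
    omega = math.acos(dot)
    if abs(omega) < eps:
        # Nearly parallel vectors: fall back to ordinary linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Endpoints recover the inputs; t=0.5 blends them along the arc.
print(slerp(0.0, [1.0, 0.0], [0.0, 1.0]))  # [1.0, 0.0]
print(slerp(0.5, [1.0, 0.0], [0.0, 1.0]))
```

Unlike a plain weighted average, this keeps the interpolated vector's magnitude consistent with the endpoints, which is why SLERP is often preferred for merging normalized weight directions.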

Intended Use Cases

This model is designed for general-purpose conversational AI and instruction-following tasks. Its development aimed to avoid a common pitfall of fine-tuning, in which a fine-tuned model ends up performing worse than its base counterpart on general tasks. It is particularly suitable for applications requiring a capable 27B parameter model that retains strong instruction-following abilities.