sookjung/army_model_gemma2b

Model metadata:

  • Task: Text generation
  • Model size: 2.5B parameters
  • Quantization: BF16
  • Context length: 8k
  • Published: Apr 15, 2026
  • License: apache-2.0
  • Architecture: Transformer (open weights)
  • Concurrency cost: 1

sookjung/army_model_gemma2b is a 2.5-billion-parameter causal language model developed by sookjung, based on Google's Gemma-2-2b architecture. It is fine-tuned for Korean-language tasks on the sookjung/fintech_sample dataset and is designed for text generation, with a context length of 8192 tokens.


Model Overview

sookjung/army_model_gemma2b is a specialized language model built on the google/gemma-2-2b base model. With 2.5 billion parameters and an 8192-token context window, it is aimed primarily at text generation.

Key Characteristics

  • Base Model: Derived from Google's Gemma-2-2b, a compact and efficient open-weights foundation.
  • Language Focus: Specifically fine-tuned for the Korean language, making it suitable for applications requiring strong Korean linguistic capabilities.
  • Training Data: Utilizes the sookjung/fintech_sample dataset, suggesting potential specialization or enhanced performance in fintech-related or similar domains within Korean.
  • Pipeline Tag: Configured for text-generation, indicating its primary intended use for generating coherent and contextually relevant text.
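
The characteristics above suggest the standard Hugging Face causal-LM loading path. A minimal sketch, assuming the weights are available under the repo id sookjung/army_model_gemma2b and load with the usual transformers API; the generation settings are illustrative, not official:

```python
# Loading sketch (assumption: standard transformers causal-LM API applies;
# the generation settings below are illustrative defaults, not from the card).
MODEL_ID = "sookjung/army_model_gemma2b"

GEN_KWARGS = {
    "max_new_tokens": 256,
    "do_sample": True,
    "temperature": 0.7,
}

def generate(prompt: str) -> str:
    """Download the model (on first call) and generate a Korean completion."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # matches the BF16 quantization listed above
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, **GEN_KWARGS)
    return tokenizer.decode(output[0], skip_special_tokens=True)

# Example call (fetches weights from the Hub):
#   generate("한국어로 자기소개를 해주세요.")  # "Please introduce yourself in Korean."
```

The bfloat16 dtype mirrors the BF16 quantization in the card; on hardware without BF16 support, float32 is the safe fallback.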

Intended Use Cases

This model is particularly well-suited for:

  • Generating Korean text in various applications.
  • Domain-specific understanding and generation, particularly fintech-adjacent tasks, to the extent the fintech_sample training data confers such specialization.
  • Applications where a compact yet capable Korean-centric language model is beneficial.
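
With an 8192-token window, prompt length and generation budget trade off directly. A small helper for keeping prompt plus output inside that window; the function name and interface are illustrative, and token counts are assumed to come from the model's tokenizer:

```python
CONTEXT_LENGTH = 8192  # the model's context window, per the card above

def generation_budget(prompt_tokens: int, requested_new_tokens: int) -> int:
    """Clamp the number of new tokens so prompt + output fits the context window."""
    if prompt_tokens >= CONTEXT_LENGTH:
        raise ValueError(
            f"prompt ({prompt_tokens} tokens) exceeds the {CONTEXT_LENGTH}-token context"
        )
    return min(requested_new_tokens, CONTEXT_LENGTH - prompt_tokens)
```

For example, an 8000-token prompt leaves room for at most 192 new tokens, so a request for 512 would be clamped down before calling generate.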