kadalicious22/snapgate-3B

Hugging Face
Text Generation · Concurrency Cost: 1 · Model Size: 3.1B · Quant: BF16 · Ctx Length: 32k · Published: Mar 25, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights

kadalicious22/snapgate-3B is a 3.1 billion parameter language model fine-tuned from Qwen2.5-3B by kadalicious22, optimized for customer service, summarization, and task execution in Indonesian and English business contexts. Its 32,768-token context length lets it process long documents in a single pass. The model is designed to handle customer inquiries, condense lengthy texts, and execute structured text-based instructions.


Overview

kadalicious22/snapgate-3B is a 3.1 billion parameter language model, fine-tuned from the Qwen2.5-3B base model. Developed by kadalicious22, it is specifically designed for business applications in both Indonesian and English, with a notable context length of 32,768 tokens.

Key Capabilities

  • Customer Service: Responds to customer inquiries in a friendly and solution-oriented manner.
  • Summarization: Efficiently condenses lengthy texts into concise bullet points.
  • Task Execution: Capable of executing structured, text-based instructions, suitable for agent-like functionalities.
  • Bilingual Support: Optimized for performance in both Indonesian and English languages.
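Qwen2.5-family models are conversational and expect prompts in the ChatML format. As a minimal sketch (assuming snapgate-3B inherits its base model's ChatML template, which the card does not explicitly state), a bilingual customer-service prompt could be assembled like this:

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Assemble a ChatML-style prompt as used by Qwen2.5-family models.

    The special tokens <|im_start|> / <|im_end|> delimit each turn; the
    prompt ends with an open assistant turn for the model to complete.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Example: an Indonesian customer-service exchange (illustrative content).
prompt = build_chatml_prompt(
    system="Anda adalah asisten layanan pelanggan yang ramah dan solutif.",
    user="Bagaimana cara melacak status pesanan saya?",
)
```

In practice, `tokenizer.apply_chat_template(...)` from the Transformers library produces this formatting automatically from a list of message dicts, so manual assembly is only needed when working outside that API.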

Technical Details & Limitations

The model has approximately 3.1 billion parameters stored in BF16 (bfloat16) precision, is built on the Transformers framework, and is distributed in Safetensors format. Recommended hardware is at least 8 GB of GPU VRAM and 16 GB of system RAM. While effective for its target use cases, it may underperform larger models on complex tasks, it should not be relied on for real-time or up-to-date knowledge, and performance in languages other than Indonesian and English is not guaranteed.
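The 8 GB VRAM recommendation can be sanity-checked from the parameter count: BF16 stores each parameter in 2 bytes, so the weights alone occupy just under 6 GiB, leaving headroom for the KV cache and activations. A rough back-of-the-envelope estimate (the 3.1B figure is taken from the card; the headroom split is an assumption):

```python
# Rough VRAM estimate for loading snapgate-3B weights in BF16.
PARAMS = 3.1e9          # parameter count from the model card
BYTES_PER_PARAM = 2     # BF16 = 2 bytes per parameter

weights_bytes = PARAMS * BYTES_PER_PARAM
weights_gib = weights_bytes / 1024**3   # convert to GiB

# Weights ≈ 5.8 GiB, so an 8 GB card leaves roughly 2 GB for the
# KV cache and activations (long 32k-token contexts will use more).
headroom_gib = 8 - weights_gib
```

Note that serving the full 32,768-token context inflates KV-cache usage substantially, so longer prompts may still require offloading or quantization on an 8 GB card.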