thanhdo881/qwen2.5-3b-vivu-travel-vn

Text generation · Concurrency cost: 1 · Model size: 3.1B · Quant: BF16 · Context length: 32k · Published: Apr 17, 2026 · Architecture: Transformer

thanhdo881/qwen2.5-3b-vivu-travel-vn is a 3.1-billion-parameter Small Language Model (SLM) built on Qwen2.5-3B-Instruct with a 32k context length. Fine-tuned by thanhdo881 with LoRA for the Vietnamese tourism domain, it acts as ViVu, an intelligent travel assistant. The model is optimized for advanced RAG pipelines, with strict anti-hallucination behavior and an efficient resource footprint that makes it deployable on consumer-grade GPUs.


Overview

thanhdo881/qwen2.5-3b-vivu-travel-vn is a 3.1-billion-parameter Small Language Model (SLM) fine-tuned specifically for the Vietnamese tourism domain. Developed by thanhdo881, it is based on the Qwen2.5-3B-Instruct architecture and was instruction-tuned with LoRA via Unsloth. The model is designed to act as ViVu, an intelligent travel assistant, and supports both Vietnamese and English.

Key Capabilities

  • Strict Anti-Hallucination: The model is engineered to prevent fabrication, grounding all answers strictly within the provided context and politely declining out-of-scope queries.
  • RAG-Optimized: It excels at synthesizing information from Vector DB chunks into clean, structured Vietnamese, with support for Markdown formatting.
  • Resource Efficient: Its design allows for deployment on consumer-grade GPUs, such as the RTX 3060 or T4, due to its low VRAM footprint.
  • 32k Context Length: A 32k-token window leaves room for long retrieved passages and multi-turn travel conversations.
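The anti-hallucination behavior above hinges on how retrieved context is presented to the model at inference time. Below is a minimal sketch of a grounded-prompt builder in a chat-message format; the system-prompt wording and the function name are illustrative assumptions, not the prompt the model was actually fine-tuned with:

```python
def build_grounded_messages(question: str, chunks: list[str]) -> list[dict]:
    """Assemble chat messages that confine the model to retrieved context.

    NOTE: the system-prompt wording here is an illustrative assumption,
    not the exact instruction the model was trained on.
    """
    # Number the chunks so answers can be traced back to their source.
    context = "\n\n".join(f"[{i + 1}] {c}" for i, c in enumerate(chunks))
    system = (
        "You are ViVu, a Vietnamese travel assistant. Answer ONLY from the "
        "context below. If the answer is not in the context, politely decline."
    )
    return [
        {"role": "system", "content": f"{system}\n\nContext:\n{context}"},
        {"role": "user", "content": question},
    ]


messages = build_grounded_messages(
    "What is the best season to visit Ha Long Bay?",
    ["Ha Long Bay is generally most pleasant between October and April."],
)
```

The resulting messages list can then be rendered with the tokenizer's chat template before generation.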

Good For

  • Building intelligent chatbots or virtual assistants for Vietnamese tourism.
  • Implementing advanced Retrieval-Augmented Generation (RAG) systems where factual accuracy and context adherence are critical.
  • Applications requiring a specialized language model for a specific domain (tourism) with efficient resource usage.
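In a RAG deployment, retrieved chunks still have to fit inside the 32k window alongside the system prompt and the generation budget. A rough packing sketch, assuming a ~4-characters-per-token heuristic (a real pipeline would count tokens with the model's own tokenizer instead):

```python
def pack_chunks(chunks: list[str], token_budget: int,
                chars_per_token: int = 4) -> list[str]:
    """Greedily keep retrieved chunks, in ranked order, until the budget is spent.

    chars_per_token is a crude assumption for illustration; use the model's
    tokenizer for exact counts in production.
    """
    packed, used = [], 0
    for chunk in chunks:
        cost = max(1, len(chunk) // chars_per_token)  # approximate token cost
        if used + cost > token_budget:
            break  # ranked order: stop at the first chunk that overflows
        packed.append(chunk)
        used += cost
    return packed


# With a 20-token budget and three 40-char chunks (~10 tokens each),
# only the first two fit.
kept = pack_chunks(["a" * 40, "b" * 40, "c" * 40], token_budget=20)
```

Because the chunks arrive ranked by retrieval score, stopping at the first overflow keeps the most relevant context while leaving headroom for the prompt and the reply.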