lakshyaixi/Qwen2_5_1_5B_Group_Booking_SFT_v1
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Jan 28, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights
The lakshyaixi/Qwen2_5_1_5B_Group_Booking_SFT_v1 is a 1.5-billion-parameter, Qwen2.5-based, instruction-tuned causal language model developed by lakshyaixi. It is fine-tuned specifically for group booking scenarios, using Unsloth and Hugging Face's TRL library for efficient training, and is intended for managing and processing group booking requests in that domain.
Model Overview
The lakshyaixi/Qwen2_5_1_5B_Group_Booking_SFT_v1 is a specialized 1.5-billion-parameter language model fine-tuned from the unsloth/Qwen2.5-1.5B-Instruct base model. Developed by lakshyaixi, it is tuned to improve performance on group booking-related tasks.
Key Capabilities
- Specialized Fine-tuning: The model has undergone Supervised Fine-Tuning (SFT) specifically for group booking use cases.
- Efficient Training: It was trained with Unsloth and Hugging Face's TRL library; Unsloth reports roughly 2x faster training than standard fine-tuning. A minimal training sketch follows this list.
- Qwen2.5 Architecture: Built upon the Qwen2.5 architecture, providing a robust foundation for language understanding and generation.
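The model card does not publish the training data or hyperparameters, so the following is only a minimal sketch of the kind of Unsloth + TRL SFT setup described above. The toy dataset, LoRA settings, and training arguments are illustrative assumptions, not the values used to produce Qwen2_5_1_5B_Group_Booking_SFT_v1.

```python
# Minimal SFT sketch with Unsloth + TRL (illustrative assumptions throughout).
from datasets import Dataset
from unsloth import FastLanguageModel
from trl import SFTConfig, SFTTrainer

# Load the base model named on the card, in 4-bit to keep memory low.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Qwen2.5-1.5B-Instruct",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters (rank and target modules are assumptions).
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Tiny toy dataset with a "text" column (TRL's default text field);
# a real run would use a full group-booking conversation corpus.
train_dataset = Dataset.from_list([
    {"text": "User: Book a meeting room for 12 people on Friday.\n"
             "Assistant: I can reserve the 14-seat boardroom for Friday. "
             "What time window works for your group?"},
])

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=train_dataset,
    args=SFTConfig(
        per_device_train_batch_size=2,
        max_steps=60,
        output_dir="outputs",
    ),
)
trainer.train()
```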
Good For
- Group Booking Applications: Ideal for systems that automatically process, understand, or generate responses to group booking requests (see the inference sketch after this list).
- Domain-Specific NLP: Suitable for developers looking for a compact yet powerful model tailored to a specific business domain.
- Resource-Efficient Deployment: At 1.5 billion parameters, it is a good candidate for deployments where compute and memory are constrained, while still offering specialized, domain-tuned performance.
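Because this is an instruction-tuned Qwen2.5 model, it can be loaded with the standard Hugging Face transformers chat workflow. The prompt and generation settings below are illustrative assumptions, not recommendations from the model author.

```python
# Minimal inference sketch with Hugging Face transformers (assumed usage).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lakshyaixi/Qwen2_5_1_5B_Group_Booking_SFT_v1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="bfloat16",   # matches the BF16 quant listed on the card
    device_map="auto",
)

# Illustrative group-booking request.
messages = [
    {"role": "system", "content": "You are a group booking assistant."},
    {"role": "user", "content": "We need rooms for 18 guests, June 12-15, "
                                "with a shared meeting space. What details do you need?"},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```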