nickoo004/queryshield-1.5b
QueryShield-1.5B by nickoo004 is a 1.5 billion parameter model, fine-tuned from Qwen2.5-1.5B-Instruct, designed to optimize raw user queries into detailed, structured instruction prompts for downstream LLMs. It supports prompt rewriting across 5 languages (English, Uzbek, Russian, Kazakh, Karakalpak), including cross-lingual scenarios, and 30 professional domains. Its primary use case is to enhance the performance of other LLMs by providing them with higher-quality, expert-level prompts.
QueryShield-1.5B: Multilingual Prompt Optimizer
QueryShield-1.5B, developed by nickoo004, is a specialized 1.5 billion parameter model fine-tuned from Qwen2.5-1.5B-Instruct. Its core function is to act as an intermediary between users and downstream LLMs, transforming raw, often vague, user questions into highly structured and detailed instruction prompts. This optimization significantly improves the performance and relevance of responses from other LLMs.
Key Capabilities
- Prompt Optimization: Rewrites user queries into expert-level, detailed prompts for better LLM interaction.
- Multilingual Support: Operates across 5 languages: English, Uzbek, Russian, Kazakh, and Karakalpak, with robust support for cross-lingual scenarios.
- Domain Specialization: Optimized for 30 professional domains, including Software Engineering, Healthcare, Finance, and Agriculture, ensuring domain-specific prompt generation.
- Efficient Training: Reached an evaluation loss of 0.967 on a custom multilingual dataset of 19,530 rows while training only 147M parameters (8.7% of the total).
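The card does not include a usage snippet; a minimal sketch with the Hugging Face `transformers` chat API might look like the following. The system prompt and generation settings here are assumptions for illustration, not documented behavior of the model:

```python
# Sketch: invoking QueryShield-1.5B as a prompt optimizer via transformers.
# The SYSTEM_PROMPT below is an assumed instruction, not the official one.
SYSTEM_PROMPT = (
    "You are a prompt optimizer. Rewrite the user's raw query into a "
    "detailed, structured instruction prompt for a downstream LLM."
)

def build_optimizer_messages(raw_query: str) -> list[dict]:
    """Build a Qwen-style chat message list for the optimizer model."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": raw_query},
    ]

def optimize_query(raw_query: str, model_id: str = "nickoo004/queryshield-1.5b") -> str:
    """Run the optimizer; requires `transformers`, `torch`, and enough RAM for a 1.5B model."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    text = tokenizer.apply_chat_template(
        build_optimizer_messages(raw_query),
        tokenize=False,
        add_generation_prompt=True,
    )
    inputs = tokenizer(text, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=512)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

Since the base model is Qwen2.5-1.5B-Instruct, `apply_chat_template` should produce the expected chat format, but verify the exact prompt convention against the repository files.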
Good for
- Improving the output quality of general-purpose LLMs by providing them with refined inputs.
- Applications requiring multilingual prompt generation, especially in Central Asian languages.
- Building intelligent agents that need to interpret user intent and formulate precise instructions for AI backends.
- Developers looking for a lightweight (1.5B parameters) solution for prompt engineering automation.
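The two-stage flow implied above (optimizer in front of a general-purpose backend) can be sketched with pluggable callables; both lambdas below are stand-ins for real model calls, not part of any published API:

```python
from typing import Callable

def answer_with_optimizer(
    raw_query: str,
    optimize: Callable[[str], str],  # stand-in for QueryShield-1.5B inference
    backend: Callable[[str], str],   # stand-in for any general-purpose LLM
) -> str:
    """Rewrite the raw query first, then send the refined prompt to the backend."""
    refined_prompt = optimize(raw_query)
    return backend(refined_prompt)

# Usage with stub callables (replace with real model calls):
result = answer_with_optimizer(
    "fix my sql",
    optimize=lambda q: f"As a database expert, diagnose and correct this SQL query: {q}",
    backend=lambda p: f"[backend response to: {p}]",
)
```

Keeping the two stages behind plain callables makes it easy to swap the backend LLM or A/B test raw versus optimized prompts.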