FoxyzGPT X1.1 1.7B Overview
FoxyzGPT X1.1 1.7B is a 1.7 billion parameter instruct-tuned language model built upon the Qwen3-1.7B base model. Developed by Foxyz, it is distinguished by specialized training on a private dataset that gives it a unique, human-like conversational persona, characterized by "silly language" and informal chat. The model is explicitly not designed for reasoning or tool-calling capabilities.
Key Characteristics & Usage
- Persona-driven Chat: Optimized for generating responses in a specific, informal, and often "silly" conversational style, mimicking a Discord direct message interaction.
- Base Model: Fine-tuned from Qwen/Qwen3-1.7B.
- Context Length: Supports a context length of 32768 tokens.
- Chat Format: Utilizes the ChatML format, with a specific system prompt recommended for optimal performance. The assistant's responses use a ~> notation to signal new messages, allowing for multi-line and multi-message outputs within a single turn.
- Limited Scope: Due to its specialized training, the model is not accurate for general question-answering and is best used for its intended conversational role.
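Since a single turn can contain several messages separated by ~> markers, client code typically needs to split the raw completion before displaying it. The sketch below shows one way to do this; it assumes (the card does not specify) that each new message begins with a line starting with "~>", and that any following lines without a marker belong to the same message.

```python
def split_messages(completion: str) -> list[str]:
    """Split a raw assistant completion into individual messages.

    Assumed convention: a line beginning with "~>" starts a new
    message; subsequent unmarked lines continue that message.
    """
    messages: list[str] = []
    current: list[str] | None = None
    for line in completion.splitlines():
        if line.startswith("~>"):
            # Marker found: close out the previous message, start a new one.
            if current is not None:
                messages.append("\n".join(current).strip())
            current = [line[2:].lstrip()]
        elif current is not None:
            # Continuation line of the current message.
            current.append(line)
    if current is not None:
        messages.append("\n".join(current).strip())
    return messages
```

For example, `split_messages("~> hey lol\nlong time no see\n~> whatcha up to")` would yield two messages, the first spanning two lines, which a Discord-style client could then send as separate chat bubbles.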
Ideal Use Cases
- Persona-based Chatbots: Excellent for applications requiring an AI with a distinct, informal, and engaging personality.
- Creative Conversational Agents: Suitable for generating playful or humorous dialogue in interactive scenarios.
- Discord-like Interactions: Specifically trained to simulate direct message conversations on platforms like Discord, making it ideal for similar social interaction simulations.