huihui-ai/Qwen2.5-1.5B-Instruct-abliterated-SFT
TEXT GENERATION · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32K · Published: Apr 12, 2025 · License: apache-2.0 · Architecture: Transformer

Qwen2.5-1.5B-Instruct-abliterated-SFT is a 1.5-billion-parameter instruction-tuned causal language model developed by huihui-ai. Fine-tuned from Qwen2.5-1.5B-Instruct-abliterated using the Guilherme34_uncensor dataset for supervised fine-tuning (SFT), it targets general conversational AI tasks, balancing capability and efficiency for applications that need a smaller instruction-following model with a 32K-token (32,768) context length.


Qwen2.5-1.5B-Instruct-abliterated-SFT Overview

This model, developed by huihui-ai, is a 1.5 billion parameter instruction-tuned large language model. It is fine-tuned from the huihui-ai/Qwen2.5-1.5B-Instruct-abliterated base model, leveraging the huihui-ai/Guilherme34_uncensor dataset for supervised fine-tuning (SFT). The model is licensed under Apache-2.0.

Key Capabilities

  • Instruction Following: Designed to accurately follow user instructions for various conversational tasks.
  • Efficient Performance: With 1.5 billion parameters, it offers a balance between performance and computational efficiency, suitable for deployment in resource-constrained environments.
  • Extended Context Length: Supports a 32K-token context window (32,768 tokens), enabling the model to process longer inputs and maintain conversational coherence over extended interactions.
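As a sketch of how the capabilities above can be exercised, the model can be loaded with the standard Hugging Face transformers chat workflow (AutoTokenizer/AutoModelForCausalLM with `apply_chat_template`). This is an assumed, generic usage pattern for Qwen2.5-family checkpoints, not an official snippet from the model card; the example prompt is illustrative.

```python
# Hedged sketch: generic transformers usage for a Qwen2.5-family
# instruct checkpoint. The model id comes from this card; the prompt
# and generation settings are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "huihui-ai/Qwen2.5-1.5B-Instruct-abliterated-SFT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # loads in BF16 as published
    device_map="auto",
)

# Build a chat-formatted prompt via the tokenizer's chat template.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the benefits of small language models."},
]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

# Generate a response and decode only the newly produced tokens.
output_ids = model.generate(input_ids, max_new_tokens=256)
reply = tokenizer.decode(
    output_ids[0][input_ids.shape[-1]:],
    skip_special_tokens=True,
)
print(reply)
```

For deployment, `device_map="auto"` lets accelerate place the 1.5B weights on whatever hardware is available, which fits the resource-constrained scenarios this card highlights.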

Good For

  • General Conversational AI: Ideal for chatbots, virtual assistants, and interactive applications requiring robust instruction-following capabilities.
  • Resource-Efficient Deployments: Suitable for scenarios where larger models are impractical due to hardware limitations or latency requirements.
  • Applications Requiring Long Context: Beneficial for tasks that involve processing and generating responses based on extensive textual information.