jekunz/Qwen3-1.7B-sv-SmolTalk

TEXT GENERATION · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Apr 24, 2026 · Architecture: Transformer

jekunz/Qwen3-1.7B-sv-SmolTalk is a 2 billion parameter language model fine-tuned from Qwen/Qwen3-1.7B. This model was trained using Supervised Fine-Tuning (SFT) with the TRL library. It is designed for general text generation tasks, leveraging its Qwen3 base architecture for conversational and creative prompts. The model supports a context length of 32768 tokens, making it suitable for processing longer inputs.


Model Overview

jekunz/Qwen3-1.7B-sv-SmolTalk is a 2 billion parameter language model fine-tuned from the base Qwen/Qwen3-1.7B. It was trained with Supervised Fine-Tuning (SFT) using the TRL library to improve its performance on text generation tasks.

Key Capabilities

  • General Text Generation: Capable of generating coherent and contextually relevant text based on user prompts.
  • Conversational AI: Suitable for engaging in dialogue and responding to open-ended questions.
  • Qwen3 Base: Benefits from the robust architecture and pre-training of the Qwen3 series, providing a strong foundation for language understanding and generation.
  • Extended Context Window: Supports a context length of 32768 tokens, allowing it to process and generate longer sequences of text while maintaining coherence.
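The capabilities above can be exercised with a short inference script. The following is a minimal sketch using the Hugging Face `transformers` chat-template API; the prompt and the sampling parameters (`temperature`, `top_p`, `max_new_tokens`) are illustrative assumptions, not settings recommended by the model author.

```python
MODEL_ID = "jekunz/Qwen3-1.7B-sv-SmolTalk"

# A single-turn chat prompt; the question is an illustrative placeholder.
messages = [
    {"role": "user", "content": "Write a short poem about autumn."},
]

def main() -> None:
    # Heavy imports are kept inside main() so the file can be imported
    # (e.g. to reuse `messages`) without torch/transformers loaded.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # the card lists BF16 weights
        device_map="auto",
    )

    # Render the conversation with the model's own chat template.
    input_ids = tokenizer.apply_chat_template(
        messages,
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)

    output = model.generate(
        input_ids,
        max_new_tokens=256,
        do_sample=True,   # sampling settings here are assumptions,
        temperature=0.7,  # not values published with the model
        top_p=0.9,
    )
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(output[0][input_ids.shape[-1]:],
                           skip_special_tokens=True))
```

Calling `main()` downloads the weights on first use and prints the model's completion; `apply_chat_template` renders the conversation in the chat format the tokenizer ships with.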

Training Details

The model underwent Supervised Fine-Tuning (SFT) using the TRL library. The training utilized specific versions of key frameworks:

  • TRL: 0.25.1
  • Transformers: 4.57.3
  • PyTorch: 2.9.1
  • Datasets: 4.4.1
  • Tokenizers: 0.22.1
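To match the training environment when reproducing or continuing the fine-tune, the listed versions can be pinned at install time. The package names below are the standard PyPI names; this assumes a PyTorch wheel for your platform exists at that exact version.

```shell
pip install "trl==0.25.1" "transformers==4.57.3" "torch==2.9.1" \
            "datasets==4.4.1" "tokenizers==0.22.1"
```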

Good For

  • Developers looking for a compact yet capable language model for text generation.
  • Applications requiring conversational abilities or creative text outputs.
  • Use cases where a 32K token context window is beneficial for handling longer inputs or maintaining extended dialogue history.