Vicooooo/job-radar-qwen3-4b-posttrain-sft

Text Generation · Concurrency Cost: 1 · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Published: Apr 19, 2026 · Architecture: Transformer

Vicooooo/job-radar-qwen3-4b-posttrain-sft is a 4 billion parameter language model based on the Qwen architecture that has undergone post-training with Supervised Fine-Tuning (SFT). The model card does not describe its specific differentiators or primary use cases, suggesting it may be a foundational or general-purpose model awaiting further specialization or documentation.


Overview

Vicooooo/job-radar-qwen3-4b-posttrain-sft is a 4 billion parameter language model built on the Qwen architecture and post-trained with Supervised Fine-Tuning (SFT).

Key Characteristics

  • Model Type: Qwen-based language model.
  • Parameter Count: 4 billion parameters.
  • Context Length: Supports a context window of 32768 tokens.
  • Training: Underwent a post-training Supervised Fine-Tuning (SFT) process.
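Given these characteristics, the checkpoint should load like any Qwen-family causal LM on the Hugging Face Hub. The sketch below is an assumption based on the card's repo id, BF16 quant, and 32k context fields, not documented usage; the prompt content and generation settings are purely illustrative.

```python
"""Hypothetical loading/inference sketch for the card's checkpoint.

Assumes the standard transformers causal-LM interface; the SFT chat
template, if any, is undocumented, so we rely on the tokenizer's own
apply_chat_template.
"""
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Vicooooo/job-radar-qwen3-4b-posttrain-sft"
MAX_CONTEXT = 32768  # context window stated on the card


def load_model():
    """Load tokenizer and model in BF16, matching the card's quant field."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="bfloat16",
        device_map="auto",
    )
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_model()
    # Illustrative prompt only; the model's intended tasks are undocumented.
    messages = [{"role": "user", "content": "Summarize this job posting: ..."}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=128)
    print(tokenizer.decode(output[0][inputs.shape[-1]:],
                           skip_special_tokens=True))
```

Because the card leaves intended uses unspecified, treat any downstream prompting format as an experiment rather than a supported interface.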

Limitations and Further Information

The provided model card indicates that significant details regarding its development, specific capabilities, intended uses, training data, evaluation metrics, and potential biases are currently marked as "More Information Needed." Users should be aware that without this information, the model's precise strengths, weaknesses, and optimal applications remain undefined. Recommendations for use are pending further documentation.