yunjae-won/ubq30i_qwen4b_sft_both

Text Generation · Model Size: 4B · Quant: BF16 · Ctx Length: 32k · Architecture: Transformer · Published: Apr 28, 2026

yunjae-won/ubq30i_qwen4b_sft_both is a 4-billion-parameter, instruction-tuned language model shared by yunjae-won. It is a fine-tuned variant, though its current model card does not document the base architecture, training data, or primary differentiators. Its general purpose is conversational AI and instruction-following tasks, typical of instruction-tuned LLMs.


Model Overview

yunjae-won/ubq30i_qwen4b_sft_both is a 4-billion-parameter, instruction-tuned transformer model shared by yunjae-won on the Hugging Face Hub. The model card currently marks its base model, development process, funding, and supported languages as "More Information Needed."

Key Characteristics

  • Parameter Count: 4 billion parameters, balancing capability against computational cost.
  • Instruction-Tuned: Optimized for following instructions and engaging in conversational tasks.
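Since the model card gives no usage snippet, the sketch below shows how an instruction-tuned Hugging Face checkpoint of this kind is typically loaded and prompted with the `transformers` library. The chat-message format and generation settings are assumptions based on common practice for instruction-tuned models, not details confirmed by the model card; loading the weights downloads roughly 8 GB of BF16 parameters.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "yunjae-won/ubq30i_qwen4b_sft_both"


def build_chat(user_prompt, system_prompt="You are a helpful assistant."):
    """Build the standard chat-message list accepted by apply_chat_template."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]


def generate(prompt, max_new_tokens=256):
    """Load the model and generate a reply. Heavy: downloads the weights on first call."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="bfloat16",   # matches the BF16 quantization listed above
        device_map="auto",        # place layers on GPU if available
    )
    # Render the chat into the model's prompt format and tokenize it
    input_ids = tokenizer.apply_chat_template(
        build_chat(prompt),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt
    return tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
```

A call such as `generate("Summarize the model card in one sentence.")` would then return the model's reply as a string; within the stated 32k context length, longer prompts can be passed the same way.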

Limitations and Recommendations

The model card explicitly states that more information is needed regarding its biases, risks, and limitations. Users should treat these as open questions until the card is updated. Direct and downstream uses are likewise unspecified, so the model should be regarded as a general-purpose instruction-tuned model with no defined specialization.