boods/Qwen-14B-MedFR
Text Generation | Open Weights | Cold
- Concurrency Cost: 1
- Model Size: 14.8B
- Quantization: FP8
- Context Length: 32k
- Published: Apr 9, 2026
- License: apache-2.0
- Architecture: Transformer
boods/Qwen-14B-MedFR is a 14.8-billion-parameter Qwen2 model, fine-tuned by boods from unsloth/Qwen2.5-14B-unsloth-bnb-4bit. It was trained with Unsloth and Hugging Face's TRL library, which is reported to give roughly 2x faster training. The model is intended for general language tasks, building on the Qwen2 architecture and an efficient fine-tuning process.
Model Overview
boods/Qwen-14B-MedFR is a 14.8-billion-parameter language model fine-tuned by boods. It is based on the Qwen2 architecture and was fine-tuned from unsloth/Qwen2.5-14B-unsloth-bnb-4bit.
Key Characteristics
- Architecture: Qwen2-based, derived from the Qwen2.5 model series.
- Parameter Count: 14.8 billion parameters, offering a balance between performance and computational efficiency.
- Training Efficiency: Fine-tuned with Unsloth and Hugging Face's TRL library, which is reported to train roughly 2x faster than standard fine-tuning.
- Context Length: Supports a context length of 32768 tokens, suitable for processing longer inputs and generating coherent extended responses (see the loading sketch after this list).
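
The snippet below is a minimal, hypothetical loading sketch using the Hugging Face transformers library. It assumes the repository ships standard Qwen2 weights with a chat template; the prompt and generation settings are illustrative only and not taken from the model card.

```python
# Hypothetical usage sketch (not from the model card): load boods/Qwen-14B-MedFR
# with the standard transformers API and generate a short chat completion.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "boods/Qwen-14B-MedFR"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # let transformers pick the checkpoint dtype
    device_map="auto",    # spread the 14.8B parameters across available devices
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the Qwen2 architecture in two sentences."},
]
# Apply the model's chat template and move the input ids to the model device.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```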
Potential Use Cases
- General Language Generation: Capable of various text generation tasks due to its large parameter count and Qwen2 foundation.
- Research and Development: Provides a robust base for further fine-tuning or experimentation in specific domains, benefiting from its efficient training methodology (see the fine-tuning sketch after this list).
- Applications requiring extended context: Its 32K context window makes it suitable for tasks involving longer documents or conversations.
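
For the research and development use case above, the following is a minimal sketch of further supervised fine-tuning with Hugging Face's TRL library, which the model card names as part of the original training setup. The dataset file, hyperparameters, and the assumption of a "text" column are all illustrative, and exact SFTTrainer arguments vary between TRL versions.

```python
# Hypothetical further fine-tuning sketch using TRL's SFTTrainer.
# Dataset path and hyperparameters are placeholders, not values from the model card.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Expects a JSONL file whose records contain a "text" field (the default
# field consumed by recent TRL versions).
dataset = load_dataset("json", data_files="my_domain_data.jsonl", split="train")

training_args = SFTConfig(
    output_dir="qwen-14b-medfr-domain",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    num_train_epochs=1,
    learning_rate=2e-5,
    bf16=True,
)

trainer = SFTTrainer(
    model="boods/Qwen-14B-MedFR",  # TRL loads the checkpoint from this identifier
    train_dataset=dataset,
    args=training_args,
)
trainer.train()
```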