KingNish/Qwen2.5-0.5b-Test-ft
Text Generation
Concurrency Cost: 1
Model Size: 0.5B
Quant: BF16
Ctx Length: 32k
Published: Sep 26, 2024
License: apache-2.0
Architecture: Transformer
KingNish/Qwen2.5-0.5b-Test-ft is a compact 0.5-billion-parameter language model fine-tuned by KingNish from Qwen/Qwen2.5-0.5B-Instruct. It is designed for general question answering and has shown performance comparable to larger models such as Llama 3.2 1B on certain reasoning tasks. The model was fine-tuned on 12,800 rows of the Magpie 300k dataset, making it a good fit for question-answering applications where a small footprint is desired.
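Because this model is fine-tuned from a Qwen2.5 instruct checkpoint, prompts follow the ChatML conversation format used across the Qwen2.5 family. A minimal sketch of building such a prompt by hand (the default system message here is an assumption; in practice the tokenizer's built-in chat template should be preferred):

```python
def build_chatml_prompt(user_message: str,
                        system: str = "You are a helpful assistant.") -> str:
    """Build a ChatML-style prompt as used by Qwen2.5 chat models.

    The system message default is illustrative; the actual template
    shipped with the tokenizer is authoritative.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# The prompt ends with an open assistant turn, which is where the
# model begins generating its reply.
print(build_chatml_prompt("What is the capital of France?"))
```

In normal use, `AutoTokenizer.from_pretrained(...).apply_chat_template(...)` from the Transformers library produces this string automatically from a list of role/content messages.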
Popular Sampler Settings
The top three parameter combinations used by Featherless users for this model.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –