Abhaykoul/Qwen1.5-0.5B-vortex
Text Generation · Concurrency Cost: 1 · Model Size: 0.6B · Quant: BF16 · Ctx Length: 32k · Published: Mar 11, 2024 · License: tongyi-qianwen-research · Architecture: Transformer · Warm: 0.0K
Abhaykoul/Qwen1.5-0.5B-vortex is a 0.6 billion parameter chat model fine-tuned by Abhaykoul: a dealigned chat finetune of the original Qwen1.5-0.5B, trained on the Vortex mini dataset. It offers a compact option for chat-oriented applications, maintaining competitive performance for its size across common benchmarks, and its main appeal is being a small, efficient chat model derived from the Qwen family.
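Since the card describes a standard Qwen1.5-based chat finetune, a minimal local-inference sketch with Hugging Face transformers is shown below. It assumes the repository ships the usual Qwen1.5 tokenizer and chat template; the prompt and generation settings are placeholders.

```python
# Minimal sketch: loading Abhaykoul/Qwen1.5-0.5B-vortex locally with transformers.
# Assumes the repo includes the standard Qwen1.5 tokenizer and ChatML-style chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Abhaykoul/Qwen1.5-0.5B-vortex"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

messages = [{"role": "user", "content": "What are small chat models useful for?"}]
# apply_chat_template formats the conversation with the model's chat template
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```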
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model, covering the following sampler settings: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p.
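As a rough illustration of how these sampler settings map onto an API call, the sketch below sends a chat completion request through the OpenAI-compatible Python client. The base_url, API key, and all numeric values are placeholders rather than the actual popular configs, and the non-standard samplers (top_k, min_p, repetition_penalty) are passed via extra_body, whose support depends on the serving backend.

```python
# Minimal sketch: passing sampler settings to an OpenAI-compatible chat endpoint.
# base_url, api_key, and the values below are placeholders, not the real configs.
from openai import OpenAI

client = OpenAI(base_url="https://api.featherless.ai/v1", api_key="YOUR_API_KEY")

response = client.chat.completions.create(
    model="Abhaykoul/Qwen1.5-0.5B-vortex",
    messages=[{"role": "user", "content": "Introduce yourself in one sentence."}],
    temperature=0.7,           # standard OpenAI-style sampler parameters
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    # samplers outside the OpenAI spec go through extra_body, if the backend supports them
    extra_body={"top_k": 40, "min_p": 0.05, "repetition_penalty": 1.1},
)
print(response.choices[0].message.content)
```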