ystemsrx/Qwen2-Boundless
Text Generation · Concurrency Cost: 1 · Model Size: 1.5B · Quant: BF16 · Ctx Length: 32k · Published: Aug 19, 2024 · License: apache-2.0 · Architecture: Transformer · 0.1K · Open Weights · Warm
Qwen2-Boundless is a 1.5-billion-parameter fine-tune of Qwen2-1.5B-Instruct developed by ystemsrx, with a stated context length of 131,072 tokens. It was trained on a Chinese dataset to handle diverse and complex questions, including those involving ethical, illegal, pornographic, and violent content. The model is intended for research and testing in scenarios that require responses to sensitive topics, with a primary focus on Chinese-language performance.
Popular Sampler Settings

Top 3 parameter combinations used by Featherless users for this model:

temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
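As a sketch of how the sampler parameters above would be applied in practice, the snippet below assembles a request body for an OpenAI-compatible chat-completions endpoint (the style of API Featherless exposes). All default values here are illustrative placeholders, not the site's published configurations, and `build_payload` is a hypothetical helper, not part of any official client.

```python
# Sketch: build a chat-completion payload for an OpenAI-compatible
# endpoint serving ystemsrx/Qwen2-Boundless. The default sampler
# values are illustrative assumptions, not published defaults.

def build_payload(prompt: str, *,
                  temperature: float = 0.7,
                  top_p: float = 0.9,
                  top_k: int = 40,
                  frequency_penalty: float = 0.0,
                  presence_penalty: float = 0.0,
                  repetition_penalty: float = 1.1,
                  min_p: float = 0.05) -> dict:
    """Assemble the JSON body sent to the completions endpoint."""
    return {
        "model": "ystemsrx/Qwen2-Boundless",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
        "top_p": top_p,
        "top_k": top_k,
        "frequency_penalty": frequency_penalty,
        "presence_penalty": presence_penalty,
        "repetition_penalty": repetition_penalty,
        "min_p": min_p,
    }

payload = build_payload("你好", temperature=0.8)
```

The payload can then be POSTed as JSON with any HTTP client alongside an API key; servers that do not recognize extensions such as `top_k`, `repetition_penalty`, or `min_p` will typically ignore them.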