YeungNLP/firefly-qwen1.5-en-7b
Text generation · Concurrency cost: 1 · Model size: 7.7B · Quant: FP8 · Context length: 32k · Published: Feb 29, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
YeungNLP/firefly-qwen1.5-en-7b is a 7.7 billion parameter causal language model fine-tuned from Qwen1.5-7B by YeungNLP. This model is specifically optimized as a helpful and harmless English AI assistant, demonstrating strong performance on the Open LLM Leaderboard. It was trained efficiently using QLoRA on a single V100 GPU, making it a competitive option for instruction-following tasks.
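Since the base Qwen1.5 models use a ChatML-style chat template, prompts for instruction-following are typically formatted with `<|im_start|>`/`<|im_end|>` turn markers. A minimal sketch, assuming the Firefly fine-tune keeps that template (the `build_prompt` helper and the example strings below are illustrative, not part of the model card — check the model's tokenizer config for the authoritative template):

```python
def build_prompt(system: str, user: str) -> str:
    """Format a single-turn prompt in ChatML style, as used by Qwen1.5
    base models. The fine-tune's exact template may differ."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"  # generation continues from here
    )

prompt = build_prompt(
    "You are a helpful assistant.",
    "Summarize QLoRA in one sentence.",
)
```

The trailing `<|im_start|>assistant\n` leaves the prompt open so the model's completion fills the assistant turn.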
Popular Sampler Settings
Supported sampler parameters: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p. No popular user configurations are currently recorded for this model.
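The sampler parameters above correspond to fields in a generation request. A hedged sketch of such a payload, assuming an OpenAI-compatible completions endpoint with the extended sampler fields many inference servers accept (all values here are illustrative defaults, not recorded user settings):

```python
import json

# Illustrative request body -- the values are placeholders, not the
# "popular" configurations (which are not recorded for this model).
payload = {
    "model": "YeungNLP/firefly-qwen1.5-en-7b",
    "prompt": "Write a haiku about autumn.",
    "temperature": 0.7,          # softmax temperature
    "top_p": 0.9,                # nucleus sampling cutoff
    "top_k": 40,                 # keep only the k most likely tokens
    "frequency_penalty": 0.0,    # penalize tokens by occurrence count
    "presence_penalty": 0.0,     # penalize tokens already present
    "repetition_penalty": 1.1,   # multiplicative repeat penalty
    "min_p": 0.05,               # drop tokens below this relative prob.
    "max_tokens": 128,
}
body = json.dumps(payload)  # serialized JSON request body
```

Note that `top_k`, `repetition_penalty`, and `min_p` are extensions beyond the core OpenAI API schema; support varies by serving backend.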