DreadPoor/Harpy-7B-Model_Stock
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 8k · Published: Apr 6, 2024 · License: apache-2.0 · Architecture: Transformer
Harpy-7B-Model_Stock is a 7-billion-parameter language model developed by DreadPoor, created by merging Endevor/InfinityRP-v1-7B, macadeliccc/WestLake-7B-v2-laser-truthy-dpo, and abideen/AlphaMonarch-laser using the model_stock method. The model demonstrates strong general reasoning, with an average score of 75.51 on the Open LLM Leaderboard. It is suitable for a range of general-purpose natural language understanding and generation tasks, performing particularly well on benchmarks such as HellaSwag and Winogrande.
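A merge like this is typically defined with a mergekit configuration file. The sketch below is illustrative, not the author's actual recipe: the source models and `merge_method: model_stock` come from the description above, but the base model (assumed here to be mistralai/Mistral-7B-v0.1, a common ancestor for these 7B merges) and the dtype are assumptions.

```yaml
# Hypothetical mergekit config for a model_stock merge of the three
# source models named above. base_model and dtype are assumptions.
models:
  - model: Endevor/InfinityRP-v1-7B
  - model: macadeliccc/WestLake-7B-v2-laser-truthy-dpo
  - model: abideen/AlphaMonarch-laser
merge_method: model_stock
base_model: mistralai/Mistral-7B-v0.1  # assumed common base, not stated in the card
dtype: bfloat16
```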
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model. Each configuration sets the following sampler parameters:
temperature
top_p
top_k
frequency_penalty
presence_penalty
repetition_penalty
min_p
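The sampler parameters listed above map directly onto the fields of an OpenAI-style chat-completion request. A minimal sketch of assembling such a request body is shown below; the numeric values are placeholder examples, not the actual user configurations, and support for extensions such as `top_k`, `repetition_penalty`, and `min_p` is an assumption about the serving API rather than a guarantee.

```python
def build_request(prompt: str) -> dict:
    """Assemble a chat-completion request body with explicit sampler settings.

    Values are illustrative placeholders, not the popular configs
    referenced above.
    """
    return {
        "model": "DreadPoor/Harpy-7B-Model_Stock",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.8,           # randomness of token sampling
        "top_p": 0.95,                # nucleus sampling cutoff
        "top_k": 40,                  # restrict to k most likely tokens
        "frequency_penalty": 0.0,     # penalize frequent tokens
        "presence_penalty": 0.0,      # penalize already-seen tokens
        "repetition_penalty": 1.1,    # multiplicative repeat penalty
        "min_p": 0.05,                # drop tokens below this relative prob
        "max_tokens": 256,
    }


payload = build_request("Write a short greeting.")
```

This payload could then be POSTed to an OpenAI-compatible endpoint with any HTTP client.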