FelixChao/Patronum-7B
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 8k · Published: Jan 30, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
FelixChao/Patronum-7B is a 7 billion parameter language model developed by FelixChao. This model is designed for general language understanding and generation tasks, offering a balance between performance and computational efficiency. With an 8192-token context length, it is suitable for applications requiring moderate input and output lengths. Its primary strength lies in versatile text processing across various domains.
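As a quick sanity check against the 8192-token window, a sketch like the one below can estimate whether a prompt plus its planned generation budget fits. The 4-characters-per-token ratio is a crude heuristic assumed here for illustration, not the model's real tokenizer; use the actual tokenizer for exact counts.

```python
# Rough context-budget check for an 8192-token context window.
CTX_LENGTH = 8192

def fits_in_context(prompt: str, max_new_tokens: int,
                    chars_per_token: float = 4.0) -> bool:
    """Estimate whether prompt + generation fits in the context window.

    chars_per_token is a rough heuristic; real token counts require
    the model's tokenizer.
    """
    estimated_prompt_tokens = len(prompt) / chars_per_token
    return estimated_prompt_tokens + max_new_tokens <= CTX_LENGTH

print(fits_in_context("Hello, world!", max_new_tokens=256))  # True: short prompt fits
```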
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model, covering the following sampler parameters: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, min_p.
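As a sketch of how these samplers might be supplied in practice, the snippet below assembles a completion request body in the OpenAI-compatible format that serverless inference endpoints commonly accept. The request shape, the helper name, and every parameter value are illustrative assumptions, not the actual user configurations for this model.

```python
import json

# Illustrative sampler values only; not the real top-3 Featherless configs.
sampler_config = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

def build_completion_payload(prompt: str, max_tokens: int = 256) -> dict:
    """Assemble a request body in the OpenAI-compatible completions
    format (an assumed endpoint shape, shown here for illustration)."""
    return {
        "model": "FelixChao/Patronum-7B",
        "prompt": prompt,
        "max_tokens": max_tokens,
        **sampler_config,
    }

payload = build_completion_payload("Summarize the benefits of FP8 quantization.")
print(json.dumps(payload, indent=2))
```

The payload is plain JSON, so it can be sent with any HTTP client once an endpoint URL and API key are configured.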