alnrg2arg/test_wanda_240109
Text generation model
- Model Size: 10.7B
- Quant: FP8
- Ctx Length: 4k
- Published: Jan 9, 2024
- License: cc-by-nc-4.0
- Architecture: Transformer (open weights)
- Concurrency Cost: 1

alnrg2arg/test_wanda_240109 is a 10.7 billion parameter language model, a pruned version of alnrg2arg/test. The base model is a merge of jeonsworld/CarbonVillain-en-10.7B-v2 and kyujinpy/Sakura-SOLAR-Instruct-DPO-v2, and the pruned variant has a weight sparsity of 0.49. It is intended for general language tasks, with the pruned architecture offering potentially more efficient inference.
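The "wanda" in the model name suggests Wanda-style pruning, which scores each weight by its magnitude times the norm of its input activation and zeroes the lowest-scoring weights in each output row. The sketch below is an illustrative NumPy implementation of that criterion at the stated 0.49 sparsity; the function name and shapes are assumptions for demonstration, not the actual pruning code used for this model.

```python
import numpy as np

def wanda_prune(W, X, sparsity=0.49):
    """Prune W (out_features x in_features) to the given per-row sparsity,
    scoring each weight by |w| * L2 norm of its input feature over X."""
    feat_norm = np.linalg.norm(X, axis=0)        # (in_features,)
    scores = np.abs(W) * feat_norm               # (out_features, in_features)
    k = int(W.shape[1] * sparsity)               # weights to drop per row
    # indices of the k lowest-scoring weights in each output row
    prune_idx = np.argsort(scores, axis=1)[:, :k]
    W_pruned = W.copy()
    np.put_along_axis(W_pruned, prune_idx, 0.0, axis=1)
    return W_pruned

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 100))    # toy weight matrix
X = rng.normal(size=(32, 100))   # calibration activations (tokens x features)
Wp = wanda_prune(W, X)
print(np.mean(Wp == 0.0))        # → 0.49
```

Pruning per output row (rather than globally) keeps the sparsity uniform across neurons, which is how the Wanda paper applies its criterion.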


Popular Sampler Settings

The three most popular parameter combinations among Featherless users for this model are built from the following sampler parameters (the per-combination values are shown in the interactive tabs on the model page):

- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
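These parameters map directly onto an OpenAI-compatible completion request. The sketch below assembles such a request body for this model; the specific values are illustrative placeholders, not the actual settings from the Featherless tabs.

```python
import json

# Illustrative sampler values; substitute a combination from the model page.
sampler_config = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.1,
    "min_p": 0.05,
}

# Request body for an OpenAI-compatible /v1/completions endpoint.
payload = {
    "model": "alnrg2arg/test_wanda_240109",
    "prompt": "Write a short poem about pruning.",
    "max_tokens": 128,
    **sampler_config,
}
print(json.dumps(payload, indent=2))
```

Only `temperature`, `top_p`, `frequency_penalty`, and `presence_penalty` are part of the original OpenAI schema; `top_k`, `repetition_penalty`, and `min_p` are common extensions that OpenAI-compatible inference servers typically accept as extra fields.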