rwitz2/mergemix
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 8k · License: apache-2.0 · Architecture: Transformer · Open weights

rwitz2/mergemix is a DARE-TIES merged language model built on the Mistral-7B-v0.1 architecture. It combines Mistral-7B-v0.1 with rwitz/go-bruins-v2, rwitz/dec10, and AIDC-ai-business/Marcoroni-7B-v3, each contributing with its own merge weight and density. The merge is configured for bfloat16 dtype and enables the int8_mask option. The model's intended capabilities and primary use cases are not documented.
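A DARE-TIES merge like this one is typically expressed as a mergekit configuration. The sketch below illustrates the shape of such a recipe; the weight and density values are placeholders, since the actual values used for this model are not stated here.

```yaml
# Hypothetical mergekit config sketch for a DARE-TIES merge of this kind.
# Weight/density values are placeholders, NOT the model's actual recipe.
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
models:
  - model: rwitz/go-bruins-v2
    parameters:
      weight: 0.5   # placeholder
      density: 0.8  # placeholder
  - model: rwitz/dec10
    parameters:
      weight: 0.3   # placeholder
      density: 0.8  # placeholder
  - model: AIDC-ai-business/Marcoroni-7B-v3
    parameters:
      weight: 0.2   # placeholder
      density: 0.8  # placeholder
dtype: bfloat16
int8_mask: true
```

In DARE-TIES, each contributing model's task vector (its delta from the base) is randomly sparsified according to `density`, rescaled, sign-consensus merged, and added to the base with the given `weight`.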


Popular Sampler Settings

The three most popular parameter combinations among Featherless users for this model cover the following samplers: temperature, top_p, top_k, frequency_penalty, presence_penalty, repetition_penalty, and min_p.
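These sampler knobs map directly onto the fields of an OpenAI-style completions request payload. The values below are illustrative placeholders, not the actual user configurations from the site.

```python
# Illustrative sampler settings for an OpenAI-compatible completions request.
# All values are placeholders, NOT the actual Featherless user configurations.
payload = {
    "model": "rwitz2/mergemix",
    "prompt": "Once upon a time,",
    "max_tokens": 256,
    "temperature": 0.7,          # placeholder
    "top_p": 0.9,                # placeholder
    "top_k": 40,                 # placeholder
    "frequency_penalty": 0.0,    # placeholder
    "presence_penalty": 0.0,     # placeholder
    "repetition_penalty": 1.1,   # placeholder
    "min_p": 0.05,               # placeholder
}
```

In practice this dict would be sent as the JSON body of a POST to the provider's completions endpoint, or passed as keyword arguments to an OpenAI-compatible client.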