Xclbr7/Arcanum-12b
Text generation · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · License: MIT · Architecture: Transformer · Open weights
Xclbr7/Arcanum-12b is a 12-billion-parameter causal language model developed by Xclbr7, created by merging TheDrummer/Rocinante-12B-v1.1 and MarinaraSpaghetti/NemoMix-Unleashed-12B. This Transformer-based model primarily supports English and is optimized for conversational tasks with different personas. It has a 32,768-token context length and was merged using the TIES method with per-model density parameters and int8 masking.
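The TIES procedure mentioned above merges models in three steps: trim each task vector (fine-tuned weights minus base weights) to its highest-magnitude entries according to the density parameter, elect a dominant sign per parameter, then average only the values that agree with that sign. A minimal sketch on toy 1-D tensors, assuming NumPy; the function name `ties_merge` is hypothetical and this is not the exact mergekit implementation used for this model:

```python
import numpy as np

def ties_merge(task_vectors, density=0.5):
    """Merge task vectors with TIES: trim low-magnitude entries,
    elect a sign per parameter, then take a disjoint mean over the
    values that agree with the elected sign."""
    trimmed = []
    for tv in task_vectors:
        flat = np.abs(tv).ravel()
        k = max(1, int(density * flat.size))
        # Trim step: keep only the top-k entries by magnitude.
        threshold = np.sort(flat)[-k]
        trimmed.append(np.where(np.abs(tv) >= threshold, tv, 0.0))
    stacked = np.stack(trimmed)
    # Elect the dominant sign per parameter from the summed values.
    sign = np.sign(stacked.sum(axis=0))
    sign[sign == 0] = 1.0
    # Disjoint mean: average only entries whose sign matches the elected one.
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    return (stacked * agree).sum(axis=0) / counts

# Toy example: two "models" contributing two 1-D task vectors.
a = np.array([0.9, -0.1, 0.5, -0.8])
b = np.array([0.7, 0.2, -0.6, -0.9])
merged = ties_merge([a, b], density=0.5)
```

With density=0.5 only the two largest-magnitude entries of each vector survive trimming, so the merged result keeps the positions where both models agree and zeros out the conflicting ones.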
Popular Sampler Settings
The three most popular parameter combinations used by Featherless users for this model.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
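The settings above control how the next token is drawn from the model's output distribution: temperature rescales the logits, top_k and top_p restrict sampling to the most likely tokens, min_p drops tokens far below the top candidate, and the penalty parameters down-weight tokens that have already appeared. A minimal sketch of how the filtering parameters are commonly interpreted, assuming NumPy; the function name `filter_logits` is hypothetical and this is not Featherless's implementation:

```python
import numpy as np

def filter_logits(logits, temperature=1.0, top_k=0, top_p=1.0, min_p=0.0):
    """Apply common sampler settings to a logit vector and return the
    filtered, renormalized probability distribution (a generic sketch)."""
    logits = np.asarray(logits, dtype=np.float64) / temperature
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    keep = np.ones_like(probs, dtype=bool)
    if top_k > 0:
        # top_k: keep only the k most likely tokens.
        kth = np.sort(probs)[-top_k]
        keep &= probs >= kth
    if top_p < 1.0:
        # top_p (nucleus): keep the smallest set whose mass reaches top_p.
        order = np.argsort(probs)[::-1]
        cum = np.cumsum(probs[order])
        nucleus = order[: np.searchsorted(cum, top_p) + 1]
        mask = np.zeros_like(keep)
        mask[nucleus] = True
        keep &= mask
    if min_p > 0.0:
        # min_p: drop tokens below min_p times the top token's probability.
        keep &= probs >= min_p * probs.max()
    probs = np.where(keep, probs, 0.0)
    return probs / probs.sum()

# Toy vocabulary of 5 tokens: sharpen with temperature, keep top 3.
p = filter_logits([2.0, 1.0, 0.5, 0.1, -1.0],
                  temperature=0.8, top_k=3, min_p=0.05)
```

Sampling then draws a token index from the returned distribution; lower temperatures and tighter top_k/top_p make the output more deterministic.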