aboros98/merlin1.4
Text Generation
Concurrency Cost: 1
Model Size: 3B
Quant: BF16
Ctx Length: 2k
Published: Mar 15, 2024
License: MIT
Architecture: Transformer
Open Weights · Cold
aboros98/merlin1.4 is a 3-billion-parameter language model with a 2048-token context window. Developed by aboros98, the model's specific architecture and training details have not been published, and its primary differentiators and intended use cases remain undefined while key benchmark metrics are pending.
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model. No values are recorded for this snapshot.
temperature: –
top_p: –
top_k: –
frequency_penalty: –
presence_penalty: –
repetition_penalty: –
min_p: –
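The sampler parameters above map directly onto the fields of a text-generation request. A minimal sketch of building such a request body, assuming an OpenAI-style completions payload; the sampler values below are placeholders chosen for illustration, not the unrecorded "popular" settings from the table:

```python
import json

def build_request(prompt: str) -> dict:
    """Assemble a completion request for aboros98/merlin1.4.

    All sampler values here are illustrative placeholders; the
    actual popular settings are not available in the listing above.
    """
    return {
        "model": "aboros98/merlin1.4",
        "prompt": prompt,
        "max_tokens": 256,          # response must fit within the 2048-token context
        "temperature": 0.7,         # placeholder
        "top_p": 0.9,               # placeholder
        "top_k": 40,                # placeholder
        "frequency_penalty": 0.0,   # placeholder
        "presence_penalty": 0.0,    # placeholder
        "repetition_penalty": 1.1,  # placeholder
        "min_p": 0.05,              # placeholder
    }

# Serialize for an HTTP POST to a hypothetical completions endpoint.
payload = json.dumps(build_request("Hello"))
```

Sending the payload (e.g. with `requests.post`) is left out, since the endpoint URL and authentication scheme are not part of this listing.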