Ateron/Predonia-24B-V2.1
Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Architecture: Transformer

Ateron/Predonia-24B-V2.1 is a 24-billion-parameter language model with a 32,768-token context length, developed by Ateron. It was created with the 'ties' merge method, combining the 'Precog' and 'Cydonia 4.3' models. The merge is intended to blend the strengths of both parents, making the model suitable for tasks that benefit from this kind of model fusion.
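A TIES merge of this kind is commonly expressed as a mergekit configuration. The sketch below is a hypothetical illustration only: the actual repository paths, base model, and `density`/`weight` values used for Predonia are not published here, so the names and numbers are placeholders.

```yaml
# Hypothetical mergekit-style config illustrating a 'ties' merge.
# Repo paths, base model, and parameter values are assumptions,
# not the actual recipe used for Predonia-24B-V2.1.
merge_method: ties
base_model: Ateron/Precog          # placeholder path
models:
  - model: Ateron/Precog           # placeholder path
    parameters:
      density: 0.5                 # fraction of parameters kept per model
      weight: 0.5                  # contribution to the merged weights
  - model: Ateron/Cydonia-4.3      # placeholder path
    parameters:
      density: 0.5
      weight: 0.5
dtype: bfloat16
```

In TIES merging, each model's weight deltas relative to the base are sparsified (controlled by `density`), sign conflicts between models are resolved by majority, and the surviving deltas are averaged with the given `weight`s.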
