ReadyArt/4.2.0-Broken-Tutu-24b
Task: Text Generation
Concurrency Cost: 2
Model Size: 24B
Quant: FP8
Ctx Length: 32k
License: apache-2.0
Architecture: Transformer
Availability: Open Weights

ReadyArt/4.2.0-Broken-Tutu-24b is a 24-billion-parameter language model created by ReadyArt, merged using the DARE TIES method with TheDrummer/Cydonia-24B-v4.2.0 as its base. The merge integrates components from several 24B models, including TroyDoesAI/BlackSheep-24B and multiple ReadyArt models, to enhance its overall capabilities. It supports a 32,768-token context length, making it suitable for applications that require extensive contextual understanding and generation.
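DARE TIES merges like this one are typically produced with the mergekit tool, where each source model is combined with the base according to per-model density and weight parameters. The sketch below shows what such a config might look like; the density and weight values, the dtype, and the exact list of source models are illustrative assumptions, not the published recipe.

```yaml
# Hypothetical mergekit config sketch for a DARE TIES merge.
# Densities, weights, and dtype are illustrative — not ReadyArt's actual recipe.
merge_method: dare_ties
base_model: TheDrummer/Cydonia-24B-v4.2.0
models:
  - model: TroyDoesAI/BlackSheep-24B
    parameters:
      density: 0.5   # fraction of delta weights retained (DARE drop rate = 1 - density)
      weight: 0.3    # contribution of this model's task vector to the merge
  # ...additional ReadyArt source models would be listed here with their own parameters
dtype: bfloat16
```

In DARE TIES, each contributing model's delta from the base is randomly sparsified (controlled by `density`), rescaled, then sign-consolidated and summed with the other task vectors, which tends to reduce parameter interference between the merged models.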
