Naphula-Archives/Checkpoint-T7-24B
Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Mar 22, 2026 · Architecture: Transformer

Checkpoint-T7-24B is a 24-billion-parameter causal language model from Naphula-Archives, created with the della merge method. It combines several source models, including Slimaki, Maginum Cydoms, Asmodeus v1, Asmodeus v2a, Asmodeus v2e, Magistry, and Checkpoint T6. The model is noted for complex, tangential outputs that favor elaborate language over direct instruction following, and it supports a context length of 32,768 tokens.
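Merges like this are commonly built with mergekit, where della is one of the supported merge methods. Below is a minimal, hypothetical config sketch for illustration only; the model paths, weights, and density values are placeholders and not the actual Checkpoint-T7-24B recipe:

```yaml
# Hypothetical mergekit config sketch for a della merge.
# All paths and parameter values are illustrative assumptions,
# NOT the recipe actually used for Checkpoint-T7-24B.
merge_method: della
base_model: path/to/base-model          # placeholder base model
models:
  - model: path/to/source-model-a       # e.g. one of the listed source models
    parameters:
      weight: 0.5                       # relative contribution (assumed)
      density: 0.6                      # fraction of delta weights kept (assumed)
  - model: path/to/source-model-b
    parameters:
      weight: 0.5
      density: 0.6
dtype: bfloat16
```

With mergekit installed, a config like this is typically applied with `mergekit-yaml config.yml ./output-dir`.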
