Entropicengine/Pinecone-sage-24b
Text Generation · Concurrency Cost: 2 · Model Size: 24B · Quant: FP8 · Ctx Length: 32k · Published: Jun 30, 2025 · Architecture: Transformer

Pinecone-Sage-24b is a 24-billion-parameter merged language model from Entropicengine and part of the Pinecone Series. It is designed to balance speed and performance, excelling at rich prose generation, roleplay, general knowledge, and creative writing. The model was created with the DARE TIES merge method, combining Entropicengine/DarkTriad-24b and Entropicengine/Trifecta-Max-24b on top of darkc0de/XortronCriminalComputingConfig as the base model.
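A merge like this is typically expressed as a mergekit configuration. The sketch below shows what a DARE TIES merge of the named models could look like; the `density` and `weight` values and the `dtype` are illustrative assumptions, not the actual recipe used for this model.

```yaml
# Hypothetical mergekit config for a DARE TIES merge (values are illustrative)
models:
  - model: Entropicengine/DarkTriad-24b
    parameters:
      density: 0.5   # fraction of delta weights kept after DARE pruning (assumed)
      weight: 0.5    # contribution of this model to the merge (assumed)
  - model: Entropicengine/Trifecta-Max-24b
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: darkc0de/XortronCriminalComputingConfig
dtype: bfloat16
```

With DARE TIES, each source model's difference from the base is randomly pruned (per `density`), rescaled, and sign-resolved before being added back onto the base weights.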
