Entropicengine/Pinecone-Rune-12b
Text generation · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Jul 1, 2025 · Architecture: Transformer

Entropicengine/Pinecone-Rune-12b is a 12-billion-parameter language model in the Pinecone series, created by Entropicengine via a DARE TIES merge of DreadPoor/Irix-12B-Model_Stock, inflatebot/MN-12B-Mag-Mell-R1, and yamatazen/LorablatedStock-12B. The model aims to be fast and lightweight while performing well at roleplay, general knowledge, reasoning, and creative writing. It supports a 32,768-token (32k) context length, making it surprisingly capable for its size across diverse generative tasks.
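For readers unfamiliar with DARE TIES merging, the core idea can be illustrated on toy tensors: DARE randomly drops a fraction of each model's delta (its weights minus the base model's) and rescales the survivors, then TIES elects a majority sign per parameter and averages only the agreeing deltas. This is a minimal sketch of that idea, not the actual recipe or tooling used to produce this model; all function names here are illustrative.

```python
import numpy as np

def dare(delta, p, rng):
    # DARE: drop each delta entry with probability p, rescale the rest by 1/(1-p)
    mask = rng.random(delta.shape) >= p
    return delta * mask / (1.0 - p)

def ties_merge(deltas):
    # TIES: elect the majority sign per parameter, then average only the
    # deltas whose sign agrees with the elected one
    stacked = np.stack(deltas)
    elected = np.sign(stacked.sum(axis=0))
    agree = np.sign(stacked) == elected
    counts = agree.sum(axis=0)
    return np.where(counts > 0,
                    (stacked * agree).sum(axis=0) / np.maximum(counts, 1),
                    0.0)

def dare_ties(base, task_weights, p=0.5, seed=0):
    # Merge several fine-tuned weight tensors into the base tensor
    rng = np.random.default_rng(seed)
    deltas = [dare(w - base, p, rng) for w in task_weights]
    return base + ties_merge(deltas)
```

In practice such merges are run per weight tensor over entire checkpoints; the drop rate `p` trades off interference between source models against how much of each model's signal survives.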
