icefog72/IceSakeRP-7b
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jul 7, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer

IceSakeRP-7b by icefog72 is a 7-billion-parameter language model created with the SLERP merge method, combining several IceSakeV and IceCocoaRP models. Designed to handle a context window of 25-32k tokens, the model is optimized for roleplay and creative text generation. Quantized versions (Exl2 and GGUF) are available for efficient deployment, making it suitable for applications that require extended conversational context.
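For context, SLERP (spherical linear interpolation) blends two models by interpolating along the arc between their weight vectors rather than along a straight line, which better preserves each parent's weight geometry. A minimal sketch of the formula on plain Python lists (illustrative only; real merge tooling such as mergekit applies this per tensor with its own handling of edge cases):

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values follow
    the arc between the two vectors' directions.
    """
    n0 = math.sqrt(sum(x * x for x in v0))
    n1 = math.sqrt(sum(x * x for x in v1))
    # Angle between the two vectors, from the normalized dot product.
    dot = sum(a * b for a, b in zip(v0, v1)) / (n0 * n1)
    theta = math.acos(max(-1.0, min(1.0, dot)))
    if theta < eps:
        # Near-parallel vectors: fall back to plain linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s = math.sin(theta)
    w0 = math.sin((1 - t) * theta) / s
    w1 = math.sin(t * theta) / s
    return [w0 * a + w1 * b for a, b in zip(v0, v1)]
```

For example, `slerp(0.5, [1.0, 0.0], [0.0, 1.0])` lands on the unit circle halfway between the two inputs, whereas a linear average would shrink the result's norm.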
