Kukedlc/NeoCortex-7B-slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Feb 29, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

Kukedlc/NeoCortex-7B-slerp is a 7-billion-parameter language model created by Kukedlc, formed by merging Kukedlc/Neural4gsm8k and macadeliccc/WestLake-7B-v2-laser-truthy-dpo with the SLERP (spherical linear interpolation) merge method. It inherits the strengths of both base models, supports a context length of 4096 tokens, and is intended for general language generation tasks.
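The card does not include the exact merge recipe, but the SLERP method it names interpolates each pair of corresponding weight tensors along the arc between them rather than along a straight line. A minimal sketch of that interpolation (operating on plain Python lists for illustration; real merges apply this per tensor across both checkpoints):

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values follow
    the arc on the sphere between the two directions.
    """
    n0 = math.sqrt(sum(x * x for x in v0))
    n1 = math.sqrt(sum(x * x for x in v1))
    # Angle between the two vectors, clamped for numerical safety.
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(v0, v1)) / (n0 * n1)))
    omega = math.acos(dot)
    if abs(math.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

Unlike plain averaging, SLERP preserves the geometric character of each parent's weights, which is why it is a popular choice for merging models of the same architecture.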
