Kukedlc/NeuralKrishna-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 18, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
NeuralKrishna-7B-slerp is a 7-billion-parameter language model from Kukedlc, created by merging Neural4gsm8k and NeuralMaxime-7B-slerp with the SLERP (spherical linear interpolation) merge method. The model targets general language tasks and shows strong results across benchmarks covering reasoning, common-sense inference, and mathematical problem solving. With a 4096-token context window, it suits applications that need robust understanding and generation over moderately long inputs.
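The description above does not include the exact merge recipe, so the following is only a conceptual sketch of what SLERP does when merging two models: each pair of corresponding weight tensors is interpolated along the arc between them rather than along a straight line. The function name, the interpolation factor `t`, and the standalone demo at the bottom are illustrative assumptions, not Kukedlc's actual merge pipeline (such merges are typically produced with tooling like mergekit).

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two same-shaped weight tensors.

    Conceptual sketch only: t=0 returns v0, t=1 returns v1, and intermediate
    values move along the arc between the two flattened weight vectors.
    """
    v0_flat, v1_flat = v0.flatten().float(), v1.flatten().float()
    # Normalise to unit vectors so we can measure the angle between them.
    v0_n = v0_flat / (v0_flat.norm() + eps)
    v1_n = v1_flat / (v1_flat.norm() + eps)
    dot = torch.clamp(torch.dot(v0_n, v1_n), -1.0, 1.0)
    omega = torch.acos(dot)            # angle between the two weight vectors
    so = torch.sin(omega)
    if so.abs() < eps:                 # nearly parallel: fall back to plain lerp
        return (1.0 - t) * v0 + t * v1
    # Interpolate along the arc, then restore the original shape and dtype.
    out = (torch.sin((1.0 - t) * omega) / so) * v0_flat + (torch.sin(t * omega) / so) * v1_flat
    return out.reshape(v0.shape).to(v0.dtype)

if __name__ == "__main__":
    # Toy demo on random tensors standing in for two models' layer weights.
    a, b = torch.randn(4, 4), torch.randn(4, 4)
    merged = slerp(0.5, a, b)          # halfway along the arc between a and b
    print(merged.shape)
```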
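A minimal local inference sketch with Hugging Face transformers, assuming the weights are published under the repo id "Kukedlc/NeuralKrishna-7B-slerp" as the page title suggests. The FP8 quantization noted above refers to the hosted deployment; loading in float16 here is an assumption for running the open weights locally, and the prompt and generation settings are illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Kukedlc/NeuralKrishna-7B-slerp"  # assumed repo id, from the page title

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumed local precision; the hosted service uses FP8
    device_map="auto",          # requires the accelerate package
)

prompt = "Explain spherical linear interpolation in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Keep total prompt plus generated tokens within the 4096-token context window noted above.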