Kukedlc/NeuralShiva-7B-DT
Text Generation | Model Size: 7B | Quant: FP8 | Context Length: 4k | Published: Mar 17, 2024 | License: apache-2.0 | Architecture: Transformer | Concurrency Cost: 1
Kukedlc/NeuralShiva-7B-DT is a 7-billion-parameter language model by Kukedlc, produced by merging several 7B models, including YamShadow-7B and AlphaMonarch-7B, with the DARE TIES merge method. It is configured with a 4096-token context length and is intended for general text generation, drawing on the combined strengths of its constituent models.
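For intuition, DARE TIES combines two ideas: DARE randomly drops most of each model's parameter deltas from the shared base and rescales the survivors by 1/(1-p), and TIES resolves conflicts by electing a majority sign per parameter and averaging only the agreeing deltas. The sketch below illustrates this over flat Python lists; it is a simplified illustration of the algorithm, not mergekit's actual implementation, and the function name and drop rate are illustrative.

```python
import random

def dare_ties_merge(base, task_models, drop_p=0.9, seed=0):
    """Illustrative DARE-TIES merge over flat parameter lists.

    base: list of base-model parameters.
    task_models: list of parameter lists, one per fine-tuned model.
    drop_p: DARE drop probability (deltas kept are rescaled by 1/(1-p)).
    """
    rng = random.Random(seed)
    # 1. DARE: compute each model's delta from the base, randomly drop
    #    a fraction drop_p, and rescale survivors to preserve expectation.
    deltas = []
    for model in task_models:
        d = []
        for b, w in zip(base, model):
            if rng.random() < drop_p:
                d.append(0.0)
            else:
                d.append((w - b) / (1.0 - drop_p))
        deltas.append(d)
    # 2. TIES: per parameter, elect the majority sign and average only
    #    the surviving deltas that agree with it.
    merged = []
    for i in range(len(base)):
        vals = [d[i] for d in deltas if d[i] != 0.0]
        if not vals:
            merged.append(base[i])  # every model dropped this parameter
            continue
        sign = 1.0 if sum(vals) >= 0 else -1.0
        keep = [v for v in vals if v * sign > 0]
        merged.append(base[i] + sign * sum(abs(v) for v in keep) / len(keep))
    return merged
```

With `drop_p=0.0` and a single task model the merge simply reproduces that model, which is a handy sanity check; real merges use a high drop rate and several models, so each parameter is settled by whichever sign the surviving deltas agree on.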