Kukedlc/NeuralArjuna-7B-DT
Text generation · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 17, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

NeuralArjuna-7B-DT is a 7-billion-parameter language model developed by Kukedlc, created by merging several existing 7B models with the dare_ties merge method. The model is aimed at complex, abstract reasoning tasks, as demonstrated by its ability to generate detailed theoretical responses on topics such as unifying quantum mechanics, relativity, and cosmic consciousness. It offers balanced performance across a range of benchmarks, making it suitable for applications that require nuanced understanding and generative capability.
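The card does not list which source models were merged, so the entries below are placeholders, not the actual recipe. As a sketch, a dare_ties merge of several 7B models is typically expressed as a mergekit configuration along these lines:

```yaml
# Hypothetical mergekit config for a dare_ties merge of 7B models.
# Model names, densities, and weights are illustrative placeholders —
# the actual models behind NeuralArjuna-7B-DT are not stated on the card.
models:
  - model: mistralai/Mistral-7B-v0.1   # base model (assumed; placeholder)
  - model: example-org/model-a-7b       # placeholder source model
    parameters:
      density: 0.5   # fraction of delta weights kept (DARE drop rate)
      weight: 0.5    # contribution of this model to the merge
  - model: example-org/model-b-7b       # placeholder source model
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

In a dare_ties merge, each source model's weight deltas relative to the base are randomly sparsified (DARE) and then sign-consistently combined (TIES), which tends to preserve each model's strengths while reducing interference between them.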
