Kukedlc/Jupiter-k-7B-slerp
Text Generation · Open Weights
- Model Size: 7B
- Quant: FP8
- Ctx Length: 4k
- Concurrency Cost: 1
- Published: Mar 16, 2024
- License: apache-2.0
- Architecture: Transformer
Jupiter-k-7B-slerp is a 7-billion-parameter language model created by Kukedlc. It was formed by merging Kukedlc/NeuralContamination-7B-ties, Kukedlc/NeuralTopBench-7B-ties, and Gille/StrangeMerges_32-7B-slerp with the TIES merging method, using Kukedlc/NeuralMaxime-7B-slerp as the base model and a gradient of density and weight values in the merge configuration. It is intended for general text generation, blending characteristics of its constituent models.
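TIES merges of this kind are typically built with the mergekit tool, where per-model `density` and `weight` parameters can be given as lists to form a gradient across layers. A hypothetical configuration consistent with the description above might look like the following sketch; the numeric gradients shown are illustrative placeholders, not the model's actual merge values:

```yaml
# Hypothetical mergekit config (illustrative values only)
models:
  - model: Kukedlc/NeuralContamination-7B-ties
    parameters:
      density: [0.9, 0.6, 0.3]   # sparsification gradient across layer groups (assumed)
      weight: [0.5, 0.3, 0.2]    # contribution gradient (assumed)
  - model: Kukedlc/NeuralTopBench-7B-ties
    parameters:
      density: [0.6, 0.9, 0.6]
      weight: [0.3, 0.5, 0.3]
  - model: Gille/StrangeMerges_32-7B-slerp
    parameters:
      density: [0.3, 0.6, 0.9]
      weight: [0.2, 0.3, 0.5]
merge_method: ties
base_model: Kukedlc/NeuralMaxime-7B-slerp
dtype: bfloat16
```

Under TIES, each model's weight deltas relative to the base are sparsified according to `density`, sign conflicts are resolved by majority, and the surviving deltas are combined using the `weight` values.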