Kukedlc/neuronal-7b-Mlab
Text generation model
- Model size: 7B
- Quantization: FP8
- Context length: 4k
- Published: Feb 12, 2024
- License: apache-2.0
- Architecture: Transformer
- Concurrency cost: 1
Kukedlc/neuronal-7b-Mlab is a 7-billion-parameter language model created by Kukedlc by merging mlabonne/NeuralDaredevil-7B and mlabonne/NeuralHermes-2.5-Mistral-7B with the slerp (spherical linear interpolation) method. The merge aims to combine the strengths of its two base models; the result supports a 4096-token context length and is intended for general text-generation tasks.
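As a rough illustration of the slerp merge mentioned above, the sketch below interpolates two weight tensors along the great circle between them rather than along a straight line, which tends to preserve the geometry of the weights better than plain averaging. This is a minimal NumPy sketch of the general technique, not the exact implementation used to build this model; the function name and the `eps` fallback threshold are illustrative choices.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t is the interpolation factor in [0, 1]: t=0 returns v0, t=1 returns v1.
    Falls back to plain linear interpolation when the tensors are
    nearly parallel, where the spherical formula is numerically unstable.
    """
    a = v0.ravel().astype(np.float64)
    b = v1.ravel().astype(np.float64)
    # Angle between the two flattened, normalized weight vectors.
    dot = np.clip(
        np.dot(a / np.linalg.norm(a), b / np.linalg.norm(b)), -1.0, 1.0
    )
    omega = np.arccos(dot)
    if np.sin(omega) < eps:
        # Nearly parallel: ordinary lerp is a safe approximation.
        return ((1.0 - t) * v0 + t * v1).astype(v0.dtype)
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return (s0 * a + s1 * b).reshape(v0.shape).astype(v0.dtype)
```

In a real merge this would be applied layer by layer across the two checkpoints, often with a per-layer interpolation factor rather than a single global `t`.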