Kukedlc/NeuralGanesha-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 25, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Kukedlc/NeuralGanesha-7b is a 7-billion-parameter language model created by Kukedlc, formed by merging Kukedlc/SomeModelsMerge-7b and Kukedlc/MyModelsMerge-7b with the slerp merge method. The model supports a 4096-token context length, and the merge applies distinct interpolation weightings to the self_attn and mlp layers. It is intended as a general-purpose merged model suitable for a range of text generation tasks.
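Slerp merges of this kind are typically produced with mergekit, where per-layer-type weightings are expressed as `filter` entries on the interpolation parameter `t`. The sketch below shows what such a configuration generally looks like; the layer ranges and `t` values are illustrative assumptions, not the model's actual merge recipe.

```yaml
# Illustrative mergekit slerp config (values are placeholders, not the real recipe)
slices:
  - sources:
      - model: Kukedlc/SomeModelsMerge-7b
        layer_range: [0, 32]
      - model: Kukedlc/MyModelsMerge-7b
        layer_range: [0, 32]
merge_method: slerp
base_model: Kukedlc/SomeModelsMerge-7b
parameters:
  t:
    - filter: self_attn      # separate weighting for attention layers
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp            # separate weighting for MLP layers
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5             # default for all remaining tensors
dtype: bfloat16
```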

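The slerp merge method itself interpolates between two parent weight tensors along a great-circle arc rather than a straight line, which better preserves the geometry of the weights than plain averaging. A minimal sketch of the operation, assuming two flattened weight tensors of the same shape:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two directions. Falls back to linear interpolation
    when the tensors are nearly parallel.
    """
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    # Normalize copies only to measure the angle between directions.
    n0 = v0f / (np.linalg.norm(v0f) + eps)
    n1 = v1f / (np.linalg.norm(v1f) + eps)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel: slerp degenerates to lerp.
        out = (1.0 - t) * v0f + t * v1f
    else:
        s = np.sin(theta)
        out = (np.sin((1.0 - t) * theta) / s) * v0f \
            + (np.sin(t * theta) / s) * v1f
    return out.reshape(v0.shape)
```

In a real merge this function is applied tensor-by-tensor across the two parent checkpoints, with `t` chosen per layer group (e.g. different schedules for self_attn and mlp, as this model's description indicates).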