automerger/NeuralsirkrishnaExperiment26-7B
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: Mar 13, 2024
License: apache-2.0
Architecture: Transformer
Open weights

NeuralsirkrishnaExperiment26-7B is a 7-billion-parameter language model created by Maxime Labonne through automated merging. It was produced with the DARE TIES merge method, combining Kukedlc/NeuralSirKrishna-7b and rwitz/experiment26-truthy-iter-0, and is configured for bfloat16 precision. The merge is intended to combine the strengths of its constituent models for general text generation tasks.
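A merge like this is typically produced with mergekit. The sketch below is a hypothetical mergekit configuration consistent with the description above, not the author's actual config: the density and weight values are illustrative assumptions, and the base model is assumed (both constituents are Mistral-7B derivatives), as the card does not state it.

```yaml
# Hypothetical DARE TIES merge config (illustrative; not the published one).
models:
  - model: Kukedlc/NeuralSirKrishna-7b
    parameters:
      density: 0.53   # assumed: fraction of delta weights kept after DARE dropout
      weight: 0.5     # assumed: equal contribution from each model
  - model: rwitz/experiment26-truthy-iter-0
    parameters:
      density: 0.53
      weight: 0.5
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1   # assumed base model; not stated in the card
dtype: bfloat16                          # matches the precision noted above
```

DARE TIES randomly drops a fraction of each model's delta weights (relative to the base), rescales the remainder, and resolves sign conflicts before merging, which is why each model carries `density` and `weight` parameters.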
