automerger/Experiment28Yam-7B
Text Generation · Model Size: 7B · Quant: FP8 · Context Length: 4k · Concurrency Cost: 1 · Published: Mar 10, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights
Experiment28Yam-7B is a 7 billion parameter language model created by Maxime Labonne through an automated merge using the DARE TIES method. It combines yam-peleg/Experiment28-7B with mayacinka/yam-jom-7B-slerp, with per-model density and weight parameters set in the merge configuration. The model targets general text generation tasks, with the merge intended to improve on the performance of its parent models. It supports a 4096-token context length and is aimed at applications that need a compact yet capable LLM.
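A minimal usage sketch with the Hugging Face `transformers` library, assuming the merged weights are hosted under the repo id shown on this card; the dtype, device placement, prompt, and generation settings are illustrative choices, not values from the card.

```python
# Minimal text-generation sketch for Experiment28Yam-7B (illustrative settings).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "automerger/Experiment28Yam-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision keeps the 7B model on a single consumer GPU
    device_map="auto",
)

prompt = "Explain what a model merge is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Keep prompt plus completion within the 4096-token context window.
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```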