automerger/Experiment29Pastiche-7B
Text Generation · Open Weights
Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k
Published: Mar 10, 2024 | License: apache-2.0 | Architecture: Transformer

automerger/Experiment29Pastiche-7B is a 7-billion-parameter language model created by Maxime Labonne through an automated merge of yam-peleg/Experiment29-7B and CorticalStack/pastiche-crown-clown-7b-dare. The model was built with the slerp (spherical linear interpolation) merge method applied across specific layer ranges, and supports a 4096-token context length. Its construction via automated merging aims to combine the strengths of its constituent models, making it suitable for general text generation tasks.
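To give a sense of what a slerp merge does, the sketch below interpolates two weight tensors along the arc between them rather than along a straight line. This is an illustrative stand-alone implementation (using NumPy, with a made-up `slerp` helper and a linear-interpolation fallback for near-parallel tensors); it is not the exact merging code used to produce this model.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8, dot_threshold=0.9995):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate values follow the
    arc between the (normalized) tensors on the unit hypersphere.
    """
    # Normalize copies to measure the angle between the tensors
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.sum(v0_n * v1_n), -1.0, 1.0)

    # Nearly parallel tensors: fall back to plain linear interpolation
    if abs(dot) > dot_threshold:
        return (1.0 - t) * v0 + t * v1

    omega = np.arccos(dot)          # angle between the two tensors
    sin_omega = np.sin(omega)
    # Weights follow the spherical interpolation formula
    return (np.sin((1.0 - t) * omega) / sin_omega) * v0 + \
           (np.sin(t * omega) / sin_omega) * v1

# Example: blend two toy "layer weights" halfway
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
merged = slerp(0.5, a, b)
```

In practice a tool such as mergekit applies this kind of interpolation per layer, with interpolation factors that can vary across the layer ranges named in the merge configuration.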
