chasedreaminf/Dream-7B-slerp
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 16, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

Dream-7B-slerp is a 7-billion-parameter language model created by chasedreaminf by merging ignos/Mistral-T5-7B-v1 and Toten5/Marcoroni-neural-chat-7B-v2 with MergeKit, using SLERP (spherical linear interpolation) as the merge method. The merge combines the strengths of its constituent models into a versatile foundation for a range of natural language processing tasks. With a 4096-token context length, it is suitable for applications with moderate input and output lengths.
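SLERP merging interpolates corresponding weight tensors of the two parent models along the arc between them rather than along a straight line, which preserves the magnitude of the weights better than plain averaging. The model card does not publish the exact merge configuration, so the following is only a minimal NumPy sketch of the underlying interpolation formula, not chasedreaminf's actual MergeKit recipe; the function name `slerp` and the interpolation factor `t` are illustrative.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the
    great-circle arc between the (normalized) directions of v0 and v1.
    """
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    # Nearly colinear vectors: the arc degenerates, fall back to lerp.
    if abs(dot) > 1.0 - eps:
        return (1.0 - t) * v0 + t * v1
    omega = np.arccos(dot)          # angle between the two directions
    sin_omega = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / sin_omega) * v0 + \
           (np.sin(t * omega) / sin_omega) * v1

# Midpoint of two orthogonal unit vectors stays on the unit sphere.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)
```

In an actual MergeKit run, an interpolation like this is applied tensor-by-tensor across the two checkpoints, with `t` (possibly varying per layer) controlling how much each parent contributes.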
