Azazelle/Tippy-Toppy-7b
Text generation · 7B parameters · FP8 quantization · 4k context length · Concurrency cost: 1 · Published: Jan 3, 2024 · License: cc-by-4.0 · Architecture: Transformer

Tippy-Toppy-7b by Azazelle is a 7-billion-parameter language model produced with the DARE merge method, building on the Toppy-M-7b model. It merges components from Mistral-7B-v0.1, Undi95/Toppy-M-7B, PistachioAlt/Noromaid-Bagel-7B-Slerp, and OpenPipe/mistral-ft-optimized-1227. The model targets general language tasks, combining the strengths of its constituent models, and supports a 4096-token context length.
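The exact merge recipe is not given on this page. As a rough illustration only, a DARE-TIES merge over these components is typically expressed as a mergekit configuration along the following lines; the `density` and `weight` values below are placeholders, not the actual parameters used for Tippy-Toppy-7b:

```yaml
# Hypothetical mergekit config sketch for a DARE-TIES merge
# (parameter values are illustrative, not the model's real recipe)
models:
  - model: Undi95/Toppy-M-7B
    parameters:
      density: 0.5   # fraction of delta weights kept (assumption)
      weight: 0.4    # contribution to the merged model (assumption)
  - model: PistachioAlt/Noromaid-Bagel-7B-Slerp
    parameters:
      density: 0.5
      weight: 0.3
  - model: OpenPipe/mistral-ft-optimized-1227
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
```

In a DARE merge, each fine-tune's delta from the base model is randomly sparsified at the given density, rescaled, and then combined, which lets several specialized fine-tunes be folded into one model without retraining.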
