jeiku/Zephyr_beta_32k_7B
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 25, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

jeiku/Zephyr_beta_32k_7B is a 7-billion-parameter language model created by jeiku. It merges mistralai/Mistral-7B-Instruct-v0.2 with typeof/zephyr-7b-beta-lora using the DARE TIES merge method, combining the strengths of both components. Built on the Mistral architecture, it is intended for general instruction-following tasks.
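The DARE TIES method referenced above has two ingredients: DARE randomly drops a fraction of each fine-tuned delta (fine-tuned weights minus base weights) and rescales the survivors to preserve the expected delta, while TIES resolves sign conflicts between models by electing a majority sign per parameter. The toy sketch below, using NumPy, illustrates the idea on small tensors; it is not the actual mergekit implementation, and the function names (`dare_sparsify`, `ties_sign_elect`) are hypothetical.

```python
import numpy as np

def dare_sparsify(delta, drop_rate, rng):
    # DARE: Drop a random fraction of the delta entries, And REscale the
    # survivors by 1/(1 - drop_rate) so the expected delta is unchanged.
    mask = rng.random(delta.shape) >= drop_rate
    return delta * mask / (1.0 - drop_rate)

def ties_sign_elect(deltas):
    # TIES sign election: for each parameter, sum the deltas to elect a
    # majority sign, keep only contributions that agree with it, and
    # average the survivors (parameters with no survivors stay zero).
    stacked = np.stack(deltas)
    elected = np.sign(stacked.sum(axis=0))
    agree = np.sign(stacked) == elected
    counts = np.maximum(agree.sum(axis=0), 1)
    return np.where(agree, stacked, 0.0).sum(axis=0) / counts

# Toy example: merge two task deltas back onto a base weight tensor.
rng = np.random.default_rng(0)
base = np.zeros((4, 4))
delta_a = rng.normal(size=(4, 4))
delta_b = rng.normal(size=(4, 4))
sparse = [dare_sparsify(d, drop_rate=0.5, rng=rng) for d in (delta_a, delta_b)]
merged = base + ties_sign_elect(sparse)
print(merged.shape)  # → (4, 4)
```

In the real merge, these operations run per weight tensor across the full 7B-parameter model, with the drop rate and per-model weights set in a mergekit configuration.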
