Eric111/Mistral-7B-Instruct_v0.2_UNA-TheBeagle-7b-v1
Task: Text generation
Model size: 7B
Quantization: FP8
Context length: 4K
Published: Mar 7, 2024
License: cc-by-nc-nd-4.0
Architecture: Transformer

Eric111/Mistral-7B-Instruct_v0.2_UNA-TheBeagle-7b-v1 is a 7-billion-parameter language model created by Eric111 by merging Mistral-7B-Instruct-v0.2 with UNA-TheBeagle-7b-v1 using spherical linear interpolation (SLERP). The merge combines the capabilities of a strong instruction-tuned base model with a specialized fine-tune, and the result supports a 4096-token context length and is intended for general instruction-following tasks.
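The SLERP method mentioned above interpolates between two models' weights along the arc of a hypersphere rather than along a straight line, which tends to preserve the geometry of each parent's parameter space better than plain averaging. The exact merge recipe for this model is not published here; the following is a minimal, self-contained sketch of the SLERP formula applied to two flat weight vectors (the function name and toy vectors are illustrative, not taken from the actual merge):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flat weight vectors.

    t is the interpolation factor: 0 returns v0, 1 returns v1.
    """
    # Compute the angle between the (normalized) vectors
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    omega = np.arccos(dot)
    if np.abs(np.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to linear interpolation
        return (1.0 - t) * v0 + t * v1
    # Weights follow the arc of the hypersphere, not the chord
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1

# Toy example: interpolate halfway between two orthogonal "weight" vectors
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)
```

In a real merge, this interpolation is applied tensor-by-tensor across the two checkpoints (tools such as mergekit automate this and allow per-layer interpolation factors).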
