Weyaxi/Instruct-v0.2-Seraph-7B
Text Generation · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Dec 12, 2023 · License: apache-2.0 · Architecture: Transformer

Weyaxi/Instruct-v0.2-Seraph-7B is a 7-billion-parameter instruction-tuned language model created by Weyaxi, built by merging Weyaxi/Seraph-7B with mistralai/Mistral-7B-Instruct-v0.2. The merge uses slerp (spherical linear interpolation), which blends the weights of the two parent models so the result inherits characteristics of both. It is designed for general instruction-following tasks and supports a 4096-token context length.
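To illustrate the merge method mentioned above, here is a minimal sketch of slerp applied to a pair of flattened weight tensors. This is a generic illustration of spherical linear interpolation, not the exact merge configuration used for this model (tools like mergekit typically apply per-layer interpolation factors):

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two flattened weight vectors.

    t=0 returns a, t=1 returns b; intermediate t values interpolate
    along the arc between the two (normalized) directions.
    """
    a_norm = a / (np.linalg.norm(a) + eps)
    b_norm = b / (np.linalg.norm(b) + eps)
    dot = np.clip(np.dot(a_norm, b_norm), -1.0, 1.0)
    omega = np.arccos(dot)  # angle between the two weight directions
    if omega < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1.0 - t) * a + t * b
    so = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / so) * a + (np.sin(t * omega) / so) * b
```

Unlike a plain weighted average, slerp follows the arc between the two weight directions, which tends to preserve the magnitude of the interpolated weights when the parents differ in direction.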
