SrCh1nask1/X-Machina-7b-slerp-v0.0
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Apr 8, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

SrCh1nask1/X-Machina-7b-slerp-v0.0 is a 7-billion-parameter language model created by SrCh1nask1 by merging occiglot/occiglot-7b-es-en-instruct and chihoonlee10/T3Q-DPO-Mistral-7B with the slerp (spherical linear interpolation) method. The merge combines the instruction-following capabilities of its base models with the potentially improved performance of the DPO-tuned Mistral variant, making it suitable for general text-generation tasks within its 4096-token context window.
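Slerp merging interpolates each pair of corresponding weight tensors along the arc between them on a hypersphere, rather than along the straight line used by plain averaging, which better preserves weight magnitudes. A minimal sketch of the core interpolation (not the full merge pipeline used for this model; function and parameter names are illustrative):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t values move
    along the great-circle arc between the two (flattened) tensors.
    """
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    # Angle between the two tensors, from their normalized dot product
    dot = np.clip(
        np.dot(v0f / np.linalg.norm(v0f), v1f / np.linalg.norm(v1f)),
        -1.0, 1.0,
    )
    if 1.0 - abs(dot) < eps:
        # Nearly parallel: fall back to ordinary linear interpolation
        out = (1.0 - t) * v0f + t * v1f
    else:
        theta = np.arccos(dot)
        s = np.sin(theta)
        out = (np.sin((1.0 - t) * theta) / s) * v0f \
            + (np.sin(t * theta) / s) * v1f
    return out.reshape(v0.shape).astype(v0.dtype)
```

In a full merge, this interpolation would be applied tensor-by-tensor across both checkpoints, often with different `t` values for attention and MLP layers.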
