Tim419/Humpback_Myx
Task: Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Apr 14, 2025 · License: apache-2.0 · Architecture: Transformer (open weights)

Tim419/Humpback_Myx is a 7-billion-parameter Llama 2-based model built for instruction backtranslation, a self-alignment technique. It is trained in reversed order, using outputs to predict the instructions that produced them, making it the 'backward model' (Myx) for research into self-alignment methods. It is intended for reproducing the Self-Alignment with Instruction Backtranslation paper, using English-only data from openassistant-guanaco.
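As a rough illustration of how a backward model is queried, the sketch below formats a response so the model can predict a plausible instruction for it. The prompt template and the `build_backward_prompt` helper are assumptions for illustration, not a documented interface of this model; the actual generation call via Hugging Face transformers is shown commented out since it requires downloading the 7B weights.

```python
# Hypothetical sketch: prompting a backward (Myx) model to recover an
# instruction from a given response. The template is an assumption,
# not this model card's documented prompt format.

def build_backward_prompt(response: str) -> str:
    """Format a response so a backward model can predict its instruction."""
    return f"{response}\n\n### Instruction:\n"

# Actual inference with Hugging Face transformers (commented out because it
# downloads ~7B parameters of weights):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("Tim419/Humpback_Myx")
# model = AutoModelForCausalLM.from_pretrained("Tim419/Humpback_Myx")
# inputs = tok(build_backward_prompt("Paris is the capital of France."),
#              return_tensors="pt")
# out = model.generate(**inputs, max_new_tokens=64)
# print(tok.decode(out[0], skip_special_tokens=True))

prompt = build_backward_prompt("Paris is the capital of France.")
print(prompt)
```

In the backtranslation setup, instructions generated this way from unlabeled web text are then filtered for quality before being used to fine-tune the forward model.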
