YuxinJiang/lion-7b
Task: Text generation
Concurrency cost: 1
Model size: 7B
Quantization: FP8
Context length: 4k
Published: May 25, 2023
License: MIT
Architecture: Transformer

YuxinJiang/lion-7b is a 7-billion-parameter language model published by YuxinJiang and based on the LLaMA architecture. It was trained via adversarial distillation, a technique for transferring knowledge from larger proprietary models that are accessible only through their APIs. The model is intended primarily for research into efficient model training and knowledge distillation from black-box LLMs.
