chargoddard/piano-medley-7b
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4K · Published: Dec 10, 2023 · License: cc-by-nc-4.0 · Architecture: Transformer

chargoddard/piano-medley-7b is a 7-billion-parameter language model by chargoddard, built on the Mistral-7B-v0.1 architecture. It is a TIES merge of several fine-tuned checkpoints, including loyal-piano-m7-cdpo and servile-harpsichord-cdpo, which were trained with cDPO on various binarized feedback datasets. The merged model is instruction-tuned with the Alpaca prompt format and, in the author's local benchmarks, outperforms its individual components, making it suitable for general conversational AI tasks.
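Since the model expects the Alpaca prompt format, requests should be wrapped in that template before generation. A minimal sketch of building such a prompt, using the template text from the original Alpaca release (the function name is illustrative, not part of this model's tooling):

```python
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Wrap a request in the standard Alpaca instruction template."""
    if input_text:
        # Variant used when the task comes with additional context.
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    # Variant for a bare instruction with no extra input.
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_alpaca_prompt("Summarize the TIES merging method in one sentence.")
print(prompt)
```

The text the model generates after the trailing `### Response:` marker is the answer; stopping generation at a following `###` sequence is a common way to keep the output clean.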
