bhavinjawade/SOLAR-10B-Nector-DPO-Jawade
Text Generation · Open Weights
- Concurrency Cost: 1
- Model Size: 10.7B
- Quantization: FP8
- Context Length: 4k
- Published: Jan 14, 2024
- License: MIT
- Architecture: Transformer
bhavinjawade/SOLAR-10B-Nector-DPO-Jawade is a 10.7-billion-parameter, DPO-aligned language model based on Upstage's SOLAR-10.7B-Instruct-v1.0. It was fine-tuned with Low-Rank Adaptation (LoRA) on a mixture of the berkeley-nest Nectar dataset and the Intel Orca DPO pairs dataset. The model is designed for instruction following and helpful chatbot responses, and is particularly suited to conversational AI tasks.
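A minimal usage sketch with Hugging Face `transformers` is shown below. The `### User:`/`### Assistant:` prompt style is an assumption carried over from the base SOLAR-10.7B-Instruct model, and `generate_reply` is a hypothetical helper (loading the full 10.7B weights requires substantial memory, so the heavy import is kept lazy):

```python
def format_solar_prompt(user_message: str) -> str:
    """Build a single-turn prompt in the '### User:' / '### Assistant:'
    style used by SOLAR-10.7B-Instruct (assumed to apply to this fine-tune)."""
    return f"### User:\n{user_message}\n\n### Assistant:\n"


def generate_reply(user_message: str, max_new_tokens: int = 256) -> str:
    """Hypothetical helper: download the model and generate a response.
    transformers is imported lazily because the weights are ~10.7B params."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    name = "bhavinjawade/SOLAR-10B-Nector-DPO-Jawade"
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForCausalLM.from_pretrained(name, device_map="auto")

    inputs = tokenizer(format_solar_prompt(user_message),
                       return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

For example, `format_solar_prompt("What is DPO?")` produces the prompt string that would be fed to the tokenizer before generation.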