CHIH-HUNG/llama-2-13b-dolphin_5w
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Published: Aug 25, 2023 · License: llama2 · Architecture: Transformer · Open Weights · Cold

CHIH-HUNG/llama-2-13b-dolphin_5w is a 13-billion-parameter language model fine-tuned by CHIH-HUNG from Meta's Llama-2-13b. It was trained on the first 50,000 entries of the ehartford/dolphin dataset, with a focus on instruction-following tasks. The model scores higher than its Llama-2-13b base on benchmarks such as ARC, HellaSwag, MMLU, and TruthfulQA, making it suitable for general conversational and question-answering use.
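A minimal sketch of querying the model locally with Hugging Face `transformers`. The repo id `CHIH-HUNG/llama-2-13b-dolphin_5w` and the instruction/response prompt template below are assumptions (the dolphin dataset uses an instruction-following layout, but verify the exact format against the model card before relying on it):

```python
# Hedged sketch: prompting llama-2-13b-dolphin_5w via transformers.
# The prompt template and repo id are assumptions -- check the model card.

def build_prompt(instruction: str) -> str:
    """Assumed instruction/response prompt layout for this fine-tune."""
    return (
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

if __name__ == "__main__":
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "CHIH-HUNG/llama-2-13b-dolphin_5w"  # assumed HF repo id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # fits on a single ~28 GB GPU in fp16
        device_map="auto",
    )

    prompt = build_prompt("Explain overfitting in one sentence.")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=128, do_sample=False)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

The generation call is greedy (`do_sample=False`); for chat-style use you would typically enable sampling and tune `temperature`/`top_p`.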
