CHIH-HUNG/llama-2-13b-dolphin_20w
TEXT GENERATION
Concurrency Cost: 1 | Model Size: 13B | Quant: FP8 | Ctx Length: 4k | Published: Aug 29, 2023 | License: llama2 | Architecture: Transformer | Open Weights | Cold

CHIH-HUNG/llama-2-13b-dolphin_20w is a 13-billion-parameter language model fine-tuned by CHIH-HUNG from the Llama-2-13b base model. It was fine-tuned on the first 200,000 entries of the ehartford/dolphin dataset, with a focus on instruction-following tasks. The model posts competitive scores on benchmarks such as ARC, HellaSwag, MMLU, and TruthfulQA, making it suitable for general conversational AI and instruction-based applications.
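Since the model is instruction-tuned, inputs are typically wrapped in a system-plus-instruction prompt template before being sent to the model. The exact template this checkpoint was trained with is not documented here, so the sketch below uses a generic, hypothetical format (the `SYSTEM`/`USER`/`ASSISTANT` labels and the `build_prompt` helper are illustrative assumptions, not the model's confirmed template):

```python
def build_prompt(system: str, instruction: str) -> str:
    """Assemble a single prompt string from a system message and a user
    instruction. NOTE: this SYSTEM/USER/ASSISTANT layout is an assumed,
    generic template for illustration only; consult the model card for
    the template actually used during fine-tuning."""
    return (
        f"SYSTEM: {system}\n"
        f"USER: {instruction}\n"
        f"ASSISTANT:"
    )

# Example: build a prompt for a simple instruction-following query.
prompt = build_prompt(
    system="You are a helpful assistant.",
    instruction="Summarize the plot of Hamlet in one sentence.",
)
print(prompt)
```

The resulting string would then be tokenized and passed to the model for generation; keeping the template in a single helper makes it easy to swap in the correct format once it is confirmed.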
