CHIH-HUNG/llama-2-13b-OpenOrca_20w
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4K · Published: Aug 30, 2023 · License: llama2 · Architecture: Transformer · Open Weights

CHIH-HUNG/llama-2-13b-OpenOrca_20w is a 13-billion-parameter language model fine-tuned by CHIH-HUNG from the Llama-2-13b-hf base model on the first 200,000 entries of the OpenOrca dataset, with a focus on instruction following. It performs competitively on benchmarks such as ARC, HellaSwag, MMLU, and TruthfulQA, making it suitable for general conversational AI and reasoning tasks.
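As a Llama-2 derivative, the model can be loaded with the standard Hugging Face transformers API. The sketch below is a minimal example, assuming the weights are hosted on the Hub under the repo ID shown above and that a GPU with enough memory for a 13B model is available; the prompt and generation parameters are illustrative only.

```python
# Minimal sketch: load the model with Hugging Face transformers and run one generation.
# Assumes the checkpoint is available on the Hub under this repo ID.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CHIH-HUNG/llama-2-13b-OpenOrca_20w"  # repo ID from this model card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # 13B in fp16 needs roughly 26 GB of accelerator memory
    device_map="auto",          # let accelerate place layers across available devices
)

prompt = "Explain the difference between supervised and unsupervised learning."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Keep prompts within the 4K-token context window noted in the metadata; longer inputs will be truncated or rejected by the tokenizer/model.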
