CHIH-HUNG/llama-2-13b-OpenOrca_5w
Text Generation | Concurrency Cost: 1 | Model Size: 13B | Quant: FP8 | Ctx Length: 4k | Published: Aug 24, 2023 | License: llama2 | Architecture: Transformer | Open Weights

CHIH-HUNG/llama-2-13b-OpenOrca_5w is a 13-billion-parameter language model fine-tuned by CHIH-HUNG from the Llama-2-13b-hf base model. It was trained with LoRA on the first 50,000 entries of the OpenOrca dataset, with a focus on improving performance on reasoning and common-sense benchmarks. Compared to its base model, it shows improved scores on tasks such as ARC, HellaSwag, MMLU, and TruthfulQA.
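As a rough illustration of how such a model might be used for text generation, here is a minimal sketch with the Hugging Face `transformers` library. The repo id matches the model name above; the `generate` helper and its parameters are illustrative assumptions, not part of the model card.

```python
# Hypothetical usage sketch. Assumes the model is available under this repo id
# and that `transformers` (with a suitable backend) is installed; running it
# requires hardware capable of hosting a 13B model.
MODEL_ID = "CHIH-HUNG/llama-2-13b-OpenOrca_5w"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model lazily and generate a completion for `prompt`."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

The imports are kept inside the function so the module can be inspected without triggering a multi-gigabyte model download.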
