CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16
Text generation · Concurrency cost: 1 · Model size: 13B · Quant: FP8 · Context length: 4k · Published: Sep 6, 2023 · License: llama2 · Architecture: Transformer · Open weights

CHIH-HUNG/llama-2-13b-FINETUNE1_17w-r16 is a 13-billion-parameter language model based on Llama-2-13b and fine-tuned by CHIH-HUNG. It was trained on the huangyt/FINETUNE1 dataset, comprising approximately 170,000 training examples, using LoRA with rank 16. The fine-tune improves on the base Llama-2-13b model on benchmarks such as MMLU and TruthfulQA, making it suitable for general language understanding and generation tasks.
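To give a sense of why a LoRA rank of 16 makes fine-tuning a 13B model tractable, the sketch below computes the trainable-parameter overhead of a rank-16 update on a single square projection matrix. It assumes Llama-2-13b's hidden size of 5120; the arithmetic itself (a rank-r update adds factors A of shape r×d and B of shape d×r) is standard LoRA, not something specific to this model card.

```python
# Sketch: LoRA parameter overhead at rank r = 16, the rank this model was
# fine-tuned with. Hidden size d = 5120 is Llama-2-13b's; a LoRA update to a
# square d x d projection W adds two low-rank factors A (r x d) and B (d x r),
# so the trainable parameters per adapted matrix total 2 * r * d.
d = 5120          # Llama-2-13b hidden size (assumption from the base model)
r = 16            # LoRA rank used for this fine-tune
full = d * d      # parameters in one full d x d projection
lora = 2 * r * d  # parameters in the rank-r update B @ A

print(full)                   # 26214400
print(lora)                   # 163840
print(f"{lora / full:.4%}")   # 0.6250%
```

Per adapted projection, the rank-16 update trains roughly 0.6% of the parameters of the full weight matrix, which is what keeps the memory and compute cost of the fine-tune low relative to full fine-tuning.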
