CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w
Text Generation · Concurrency Cost: 1 · Model Size: 13B · Quant: FP8 · Ctx Length: 4k · Published: Sep 4, 2023 · License: llama2 · Architecture: Transformer · Open Weights

CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w is a 13 billion parameter language model fine-tuned from Meta's Llama-2-13b-hf. It was trained on approximately 22,000 data points from the huangyt/FINETUNE2_TEST dataset using LoRA with a context length of 4096 tokens. This model demonstrates improved performance on the HellaSwag and MMLU benchmarks compared to its base model, making it suitable for general language understanding and generation tasks.
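LoRA fine-tuning, as used here, freezes the base weight matrices and learns a low-rank update ΔW = (α/r)·B·A that is added to each adapted layer. A minimal NumPy sketch of the idea (the dimensions, rank `r=8`, and scaling `alpha=16` are illustrative only, not this model's actual adapter configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r, alpha = 64, 64, 8, 16  # toy sizes, not the real 13B layer dims

W = rng.standard_normal((d_out, d_in))     # frozen base weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, r))                   # zero-initialized, so ΔW starts at 0

def lora_forward(x):
    # base path plus the scaled low-rank update: (W + (alpha / r) * B @ A) @ x
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
# before training (B = 0) the adapter contributes nothing, so the
# output matches the frozen base model exactly
assert np.allclose(lora_forward(x), W @ x)
```

Only `A` and `B` are trained, which is why the adapter adds a small fraction of the 13B base parameters.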
