Adanato/llama3_8b_instruct_ppl_baseline-llama3_8b_instruct_ppl_bin_5
Task: Text Generation · Model Size: 8B · Quant: FP8 · Context Length: 8k · Concurrency Cost: 1 · Published: Feb 15, 2026 · License: other · Architecture: Transformer
Adanato/llama3_8b_instruct_ppl_baseline-llama3_8b_instruct_ppl_bin_5 is an 8-billion-parameter instruction-tuned language model, fine-tuned from Meta-Llama-3-8B-Instruct on the llama3_8b_instruct_ppl_bin_5 dataset, and intended as a base for further specialized fine-tuning. It retains the base model's 8192-token context length, making it suitable for tasks that require moderate-length context.
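Because the model is fine-tuned from Meta-Llama-3-8B-Instruct, prompts should follow the standard Llama 3 instruct chat format. A minimal dependency-free sketch of assembling such a prompt is shown below; the helper name is illustrative, and in practice you would use `tokenizer.apply_chat_template()` from the transformers library instead of building the string by hand:

```python
def build_llama3_prompt(system: str, user: str) -> str:
    """Assemble a single-turn prompt in the Llama 3 instruct format.

    Illustrative helper: uses the publicly documented Llama 3 special
    tokens. The returned string ends with an open assistant header so
    the model continues with its reply.
    """
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_llama3_prompt(
    "You are a helpful assistant.",
    "Summarize what FP8 quantization is.",
)
print(prompt.count("<|eot_id|>"))  # → 2 (one per completed turn)
```

Keeping the full prompt (system message plus conversation turns) under the 8192-token context limit remains the caller's responsibility.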