mylesgoose/Llama-3.1-70B-Instruct-abliterated
Text Generation · Concurrency Cost: 4 · Model Size: 70B · Quant: FP8 · Ctx Length: 32k · Published: Oct 2, 2024 · License: llama3.1 · Architecture: Transformer · Warm

mylesgoose/Llama-3.1-70B-Instruct-abliterated is an "abliterated" variant of Meta's Llama 3.1 70B Instruct, a 70 billion parameter instruction-tuned large language model; the abliteration process modifies the model's weights to suppress its built-in refusal behavior. It is served here with a 32,768 token context length and is suited to multilingual dialogue and general natural language generation tasks. The underlying Llama 3.1 70B Instruct model is strong at reasoning, code generation, and tool use, outperforming its predecessor Llama 3 70B Instruct on benchmarks including MMLU, MATH, and API-Bank. It supports commercial and research use across multiple languages, including English, German, French, Italian, Portuguese, Hindi, Spanish, and Thai.
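Like other Llama 3.1 Instruct checkpoints, this model expects prompts in the Llama 3.1 chat template. A minimal plain-Python sketch of that template is shown below; in practice you would let `tokenizer.apply_chat_template` from the `transformers` library render it, and the helper name here is illustrative:

```python
def format_llama31_prompt(messages):
    """Render a list of chat messages as a Llama 3.1 prompt string.

    messages: list of {"role": ..., "content": ...} dicts, where role is
    typically "system", "user", or "assistant".
    """
    parts = ["<|begin_of_text|>"]
    for m in messages:
        # Each turn is wrapped in header tokens and terminated with <|eot_id|>.
        parts.append(
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
            f"{m['content']}<|eot_id|>"
        )
    # Leave the prompt open at the assistant header so the model completes it.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

prompt = format_llama31_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Write a haiku about autumn."},
])
print(prompt)
```

The generated text should be cut at the model's first `<|eot_id|>` token to recover just the assistant turn.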