BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B
Text generation · Concurrency cost: 4 · Model size: 70B · Quantization: FP8 · Context length: 8k · Published: Jul 25, 2024 · License: llama3.1 · Architecture: Transformer

Infinity-Instruct-7M-Gen-Llama3_1-70B is a 70-billion-parameter instruction-tuned language model developed by the Beijing Academy of Artificial Intelligence (BAAI). It is based on the Llama 3.1 architecture with an 8192-token context length and is fine-tuned on the Infinity-Instruct-7M and Infinity-Instruct-Gen datasets, without reinforcement learning from human feedback (RLHF). The model demonstrates strong performance on benchmarks such as AlpacaEval 2.0 and Arena-Hard, with favorable results compared to GPT-4 and Llama-3.1-70B-Instruct.
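Since the model is built on Llama 3.1, prompts are presumably expected in the standard Llama 3.1 chat format. The sketch below builds such a prompt by hand for illustration; the special tokens follow the stock Llama 3.1 chat template, which this model card does not explicitly confirm. In practice you would load the model's tokenizer with Hugging Face `transformers` and call `tokenizer.apply_chat_template` rather than formatting strings yourself.

```python
# Minimal sketch: single-turn prompt in the standard Llama 3.1 chat format.
# Assumes this model uses the stock Llama 3.1 template (not confirmed above);
# prefer tokenizer.apply_chat_template from transformers in real use.

def build_prompt(system: str, user: str) -> str:
    """Format a system message and user turn as a Llama 3.1-style prompt."""
    return (
        "<|begin_of_text|>"
        f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"
        f"<|start_header_id|>user<|end_header_id|>\n\n{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_prompt(
    "You are a helpful assistant.",
    "Summarize the Infinity-Instruct datasets in one sentence.",
)
print(prompt)
```

The trailing assistant header leaves the prompt open for the model to generate its reply; generation should stop at the `<|eot_id|>` token.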
