LumiOpen/Llama-Poro-2-70B-base
Text generation · Concurrency Cost: 4 · Model Size: 70B · Quant: FP8 · Ctx Length: 8k · Published: May 27, 2025 · License: llama3.1 · Architecture: Transformer

LumiOpen/Llama-Poro-2-70B-base is a 70.55 billion parameter decoder-only transformer model developed by AMD Silo AI, TurkuNLP, and HPLT. It is a continued pretraining of Llama 3.1 70B, designed to add Finnish language capabilities while maintaining strong English, code, and math proficiency. Trained on 165 billion tokens with an 8192 token context length, this base model delivers strong multilingual performance, particularly on Finnish-language tasks.
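As a minimal sketch, the base model can be loaded with Hugging Face `transformers`. The repo id comes from this card; the dtype, device placement, and generation settings below are illustrative assumptions, not recommendations from the model authors, and running a 70B model requires substantial GPU memory.

```python
# Hedged sketch: loading LumiOpen/Llama-Poro-2-70B-base with transformers.
# Generation settings here are assumptions for illustration only.

MODEL_ID = "LumiOpen/Llama-Poro-2-70B-base"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a continuation with the base model (needs GPU memory for 70B)."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # assumed dtype; FP8 serving needs a dedicated runtime
        device_map="auto",           # shard across available GPUs
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Because this is a base (not instruction-tuned) model, prompts should be framed as text to continue rather than as chat turns.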
