Unbabel/Tower-Plus-2B is a 2.6 billion parameter multilingual large language model built on Gemma 2 2B. Developed by Unbabel, it was trained through continued pretraining, instruction tuning, and weighted preference optimization, using parallel and multilingual data covering 22 languages. The model excels at multilingual tasks, particularly machine translation, and supports a context length of 8192 tokens.
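A minimal sketch of loading the model with the Hugging Face transformers library and requesting a translation through the chat interface. The prompt wording, language pair, and generation settings below are illustrative assumptions, not a format prescribed by Unbabel.

```python
# Sketch: translate a sentence with Unbabel/Tower-Plus-2B via transformers.
# Assumes a recent transformers version with chat-style pipeline inputs.
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="Unbabel/Tower-Plus-2B",
    device_map="auto",
)

# Example translation request; the instruction phrasing is an assumption.
messages = [
    {
        "role": "user",
        "content": (
            "Translate the following text from English into German.\n"
            "English: The meeting has been moved to Thursday afternoon.\n"
            "German:"
        ),
    }
]

outputs = pipe(messages, max_new_tokens=128, do_sample=False)

# With chat-style input, generated_text holds the conversation;
# the last message is the model's reply.
print(outputs[0]["generated_text"][-1]["content"])
```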