Aryanne/TinyllamaMix-1.1B is an experimental 1.1 billion parameter language model created by Aryanne, built on the TinyLlama architecture. The model was assembled with a custom task-swapping and task-arithmetic merge method aimed at improving performance in role-playing scenarios. It achieves an average score of 32.99 on the Open LLM Leaderboard, with notable results on HellaSwag and Winogrande.
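A minimal inference sketch using the Hugging Face transformers library is shown below. The model ID matches this listing; the prompt format and generation settings are assumptions for illustration, not documented values from the model card.

```python
# Minimal sketch: load the merged model and generate a role-play style reply.
# Prompt layout and sampling parameters are assumed, not specified by the author.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Aryanne/TinyllamaMix-1.1B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = (
    "You are a helpful roleplay assistant.\n"
    "User: Describe a quiet seaside town at dusk.\n"
    "Assistant:"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,      # sampling tends to suit open-ended role-play prompts
    temperature=0.8,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```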