SakanaAI/EvoLLM-JP-A-v1-7B
Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Concurrency cost: 1 · Published: Mar 8, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

SakanaAI/EvoLLM-JP-A-v1-7B is an experimental 7-billion-parameter general-purpose Japanese large language model developed by Sakana AI. This autoregressive model was created with an Evolutionary Model Merge method, combining Shisa Gamma 7B v1, Arithmo2 Mistral 7B, and Abel 7B 002. It is intended for research and development purposes, with a focus on Japanese language tasks.
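Evolutionary Model Merge combines existing checkpoints in parameter space rather than training from scratch, searching over the merge recipe itself. As a rough illustration only — not the actual recipe behind this model, which uses evolutionary search over the mixing parameters — a weighted parameter-space merge can be sketched as follows; the helper name and the fixed mixing weights are illustrative assumptions:

```python
# Hedged sketch: weighted parameter-space merging of model checkpoints.
# The real EvoLLM-JP recipe evolves the merge parameters; here the
# mixing weights are simply fixed for illustration.

def merge_state_dicts(state_dicts, weights):
    """Return a weighted average of parameter dicts with matching keys."""
    assert len(state_dicts) == len(weights)
    assert abs(sum(weights) - 1.0) < 1e-9, "mixing weights should sum to 1"
    merged = {}
    for key in state_dicts[0]:
        merged[key] = sum(w * sd[key] for sd, w in zip(state_dicts, weights))
    return merged

# Toy example: scalar "parameters" stand in for weight tensors.
a = {"layer.weight": 1.0}
b = {"layer.weight": 3.0}
c = {"layer.weight": 5.0}
merged = merge_state_dicts([a, b, c], [0.5, 0.25, 0.25])
print(merged["layer.weight"])  # 0.5*1.0 + 0.25*3.0 + 0.25*5.0 = 2.5
```

In practice each entry would be a tensor and the weights could vary per layer; an evolutionary search then scores candidate recipes on downstream tasks and keeps the best-performing merges.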
