future-architect/Llama-3.1-Future-Code-Ja-8B is an 8-billion-parameter large language model developed by Future Corporation, built on Meta Llama 3.1. It was continually pre-trained on a mixture of code and Japanese natural-language data, drawn primarily from The Stack V2 and the LLM-jp Corpus v3. The model targets code generation and completion in over 40 programming languages, supports Japanese and English, and provides Fill-in-the-Middle (FIM) capability; it outperforms the original Llama 3.1 on code tasks and the Qwen model family on Japanese generation.
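As a minimal sketch of how FIM prompting generally works with code models: the surrounding code is split into a prefix and a suffix, and the model is asked to generate the missing middle. The special token names below (`<|fim_prefix|>` etc.) are illustrative placeholders borrowed from common code-LLM conventions, not confirmed tokens of this model; check the model's tokenizer configuration for the actual FIM tokens before use.

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a Fill-in-the-Middle prompt in PSM (prefix-suffix-middle) order.

    NOTE: the special tokens here are hypothetical placeholders; the real
    token strings must be taken from the model's tokenizer config.
    """
    return f"<|fim_prefix|>{prefix}<|fim_suffix|>{suffix}<|fim_middle|>"


# Example: ask the model to fill in the body of a function.
prompt = build_fim_prompt(
    prefix="def add(a, b):\n    ",
    suffix="\n\nprint(add(1, 2))\n",
)
```

The resulting string would then be tokenized and passed to the model for generation; the text produced after `<|fim_middle|>` is the candidate completion for the gap.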