Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V5-70B
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V5-70B is a 70-billion-parameter Japanese-enhanced language model developed by Mr. Yunsung Ji (Saxo) at Linkbricks. It was fine-tuned with SFT, DPO, and MERGE techniques on a base model, incorporating 20 million Japanese news and Wikipedia corpus entries. The model is strong at high-dimensional analysis of customer reviews and social posts, coding, writing, mathematics, and logical reasoning, and uses cross-lingual training data spanning Japanese, Korean, Chinese, and English.
Model Overview
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V5-70B is a 70-billion-parameter language model developed by Mr. Yunsung Ji (Saxo), a data scientist at Linkbricks, an AI and big data analytics company. It is an enhanced version of the Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V4-70B base model and underwent Japanese SFT, DPO, and MERGE training on eight H100-80G GPUs.
Key Capabilities
- Japanese Language Enhancement: Trained on 20 million Japanese news and Wikipedia corpus entries.
- Multilingual Cross-Learning: Incorporates cross-training data for Japanese, Korean, Chinese, and English, so knowledge transfers across the four languages.
- Advanced Reasoning: Specifically trained to handle complex logical and mathematical problems.
- Functional Support: Features Function Calling and Tool Calling capabilities.
- Extended Context Window: Supports a 128k-Context Window.
- Specialized Analysis: Enhanced for high-dimensional analysis of customer reviews and social posts.
- Coding and Writing: Improved performance in coding and writing tasks.
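As a rough sketch of how a checkpoint like this is typically loaded for chat-style inference with Hugging Face `transformers` (the repo id comes from this card; the system prompt, dtype, and generation settings below are assumptions, not documented defaults):

```python
MODEL_ID = "Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V5-70B"

def build_messages(user_prompt: str) -> list[dict]:
    """Build a chat-style message list. The Japanese system prompt
    here is an assumption for illustration only."""
    return [
        {"role": "system", "content": "あなたは親切なアシスタントです。"},
        {"role": "user", "content": user_prompt},
    ]

def generate(user_prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model and generate a reply. Heavy imports are deferred
    so the prompt helper above stays usable without GPU dependencies."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # assumed precision; a 70B model needs multi-GPU or quantization
        device_map="auto",
    )
    inputs = tokenizer.apply_chat_template(
        build_messages(user_prompt),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)
```

For Function Calling and Tool Calling, the same chat-template path applies, with tool schemas passed according to the base model's template conventions.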
Training Details
Training used DeepSpeed ZeRO Stage 3, rsLoRA, and BAdam layer-wise mode. The tokenizer reuses the base model's configuration without vocabulary expansion. As of December 27, 2024, the model holds the Rank-1 position on the Open Japanese LLM Leaderboard.
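For orientation, a DeepSpeed ZeRO Stage 3 setup of the kind named above is usually expressed as a JSON config. The sketch below builds one in Python; every value is an illustrative assumption, not the actual configuration used to train this model:

```python
import json

# Minimal illustrative ZeRO Stage 3 config (values are assumptions).
ds_config = {
    "zero_optimization": {
        "stage": 3,                      # shard optimizer state, gradients, and parameters
        "overlap_comm": True,            # overlap communication with computation
        "contiguous_gradients": True,
        "stage3_gather_16bit_weights_on_model_save": True,
    },
    "bf16": {"enabled": True},           # H100 GPUs support bfloat16 natively
    "gradient_accumulation_steps": "auto",
    "train_micro_batch_size_per_gpu": "auto",
}

with open("ds_zero3.json", "w") as f:
    json.dump(ds_config, f, indent=2)
```

Stage 3 shards parameters as well as optimizer state and gradients across the eight GPUs, which is what makes full fine-tuning of a 70B model feasible on a single 8xH100-80G node.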