Duxiaoman-DI/XuanYuan2-70B-Chat
Task: Text generation
Model size: 70B
Quantization: FP8
Context length: 8k
Published: Feb 4, 2024
License: llama2
Architecture: Transformer
Weights: open
Concurrency cost: 4

XuanYuan2-70B-Chat, developed by Duxiaoman-DI, is a 70-billion-parameter instruction-tuned language model. It succeeds XuanYuan-70B and was enhanced through continued pre-training on high-quality data, instruction fine-tuning, and reinforcement learning from human feedback (RLHF). These stages significantly improve its general capability, safety, and financial-domain performance, and the model supports an extended context length of 16k tokens.
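As an instruction-tuned causal language model, it can be loaded and queried with the standard Hugging Face transformers API. The sketch below is a minimal, hedged example: it assumes the weights are published on the Hugging Face Hub under the repo id "Duxiaoman-DI/XuanYuan2-70B-Chat", and the "Human: ... Assistant:" prompt template is an assumption modeled on the XuanYuan chat style, not a confirmed specification; check the official model card for the exact format.

```python
def build_prompt(question: str) -> str:
    # Assumed chat template in the XuanYuan "Human/Assistant" style;
    # verify against the official model card before relying on it.
    return f"Human: {question} Assistant:"


def generate(question: str, max_new_tokens: int = 256) -> str:
    # Heavy dependencies are imported lazily so the prompt helper above
    # stays usable without transformers/torch installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "Duxiaoman-DI/XuanYuan2-70B-Chat"  # assumed Hub repo id
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        device_map="auto",   # shard the 70B weights across available GPUs
        torch_dtype="auto",  # pick the checkpoint's native precision
    )
    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Note that a 70B model at FP8 still requires on the order of 70 GB of accelerator memory, so `device_map="auto"` (multi-GPU sharding) or an inference server is typically needed rather than a single consumer GPU.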
