Chronos-Platinum-72B Overview
ZeusLabs/Chronos-Platinum-72B is a 72.7-billion-parameter large language model built on the Qwen 2.5 base architecture. It was fine-tuned for two epochs on the proprietary Chronos Divergence dataset and is optimized for conversational and creative text generation. The model supports a context length of 131,072 tokens, making it suitable for long and complex interactions.
Key Capabilities
- Roleplaying and Storywriting: Demonstrates strong performance in generating engaging and coherent narratives for roleplaying scenarios and creative story development.
- General Assistant Tasks: Capable of handling a wide range of general conversational and informational assistant queries.
- ChatML Integration: Designed to work seamlessly with the ChatML instruct template, ensuring compatibility with many popular frontends.
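Since the model expects the ChatML instruct template, a minimal sketch of how a message list is rendered into that format may be useful. The `to_chatml` helper below is illustrative (not part of any official API), and the system prompt is a placeholder; the `<|im_start|>`/`<|im_end|>` tokens follow the standard ChatML convention.

```python
def to_chatml(messages):
    """Render a list of {role, content} dicts as a ChatML prompt string.

    Illustrative helper, assuming the standard ChatML special tokens.
    """
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Leave the assistant turn open so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

messages = [
    {"role": "system", "content": "You are a creative storytelling assistant."},
    {"role": "user", "content": "Write the opening line of a mystery novel."},
]
prompt = to_chatml(messages)
```

Most frontends that support ChatML (and tokenizers that ship a chat template) perform this rendering automatically; the sketch only shows the wire format the model sees.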
Training and Optimization
The model's training involved a combination of cleaned and de-slopped logs from provided sources and WizardLM evol data. Synthetic and partially synthetic data, generated using models from Anthropic and OpenAI, were also utilized to enhance its capabilities. The developers recommend specific sampling settings (Temp: 0.7-1.2, Min P: 0.025-0.05, Presence Penalty: 1.0, Repetition Penalty range: 4000) for optimal performance, noting that system prompts significantly influence output quality.