OpenPipe/mistral-ft-optimized-1227 is a 7-billion-parameter language model developed by OpenPipe. It is a hierarchical SLERP merge of several Mistral-7B fine-tunes, including OpenHermes-2.5, Neural-Chat-7B-v3-3, MetaMath-Mistral-7B, and OpenChat-3.5-1210, and is intended as a strong base for downstream fine-tuning across a wide range of tasks. It supports an 8192-token context length, which accommodates moderately long inputs and outputs.
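
A minimal usage sketch with the Hugging Face `transformers` library, assuming the model is loaded by its published Hub ID; the dtype, device settings, and prompt are illustrative assumptions, not part of the model card:

```python
# Sketch: load and query the model via transformers (settings are assumptions).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OpenPipe/mistral-ft-optimized-1227"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed half precision to fit a 7B model on one GPU
    device_map="auto",           # place layers automatically across available devices
)

prompt = "Explain the difference between supervised and unsupervised learning."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For fine-tuning, the same checkpoint can serve as the `from_pretrained` starting point in a standard training loop or trainer of your choice.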