The wanlige/li-14b-v0.4-slerp0.1 model is a 14.8-billion-parameter language model created by wanlige, produced by a SLERP (spherical linear interpolation) merge of wanlige/li-14b-v0.4 and sthenno-com/miscii-14b-0218. It supports a 32,768-token context length and is intended to combine the strengths of its two constituent models. The merged model is suited to general language understanding and generation tasks, with the goal of improving performance over either parent model across common benchmarks.
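A SLERP merge interpolates each pair of corresponding weight tensors along the great-circle arc between them, rather than along a straight line, which tends to preserve the geometry of the parent weights better than plain averaging. The sketch below shows the underlying spherical-interpolation formula on plain NumPy vectors; it is an illustration of the math only, not the merge pipeline actually used to build this model (the interpolation factor, tensor handling, and any per-layer settings of the real merge are not stated in this card).

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between vectors v0 and v1 at fraction t."""
    # Angle between the two vectors, computed from normalized copies
    v0_n = v0 / np.linalg.norm(v0)
    v1_n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    theta = np.arccos(dot)
    if np.abs(np.sin(theta)) < eps:
        # Nearly parallel vectors: fall back to linear interpolation
        return (1.0 - t) * v0 + t * v1
    s = np.sin(theta)
    # Standard SLERP formula: weights sum along the great-circle arc
    return (np.sin((1.0 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # midpoint on the unit arc between a and b
```

In a real model merge this interpolation is applied tensor-by-tensor across the two checkpoints, with `t` controlling how far the result leans toward the second model.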