vishnukv/WestSeverusJaskier
Text Generation · Open Weights
Model Size: 7B · Quant: FP8 · Context Length: 4k · Concurrency Cost: 1
Published: Feb 28, 2024 · License: MIT · Architecture: Transformer
vishnukv/WestSeverusJaskier is a 7-billion-parameter language model created by merging PetroGPT/WestSeverus-7B-DPO and bardsai/jaskier-7b-dpo-v6.1 using the SLERP (spherical linear interpolation) method. The merged model shows strong general reasoning ability, with an average score of 75.67 across the Open LLM Leaderboard benchmarks. It is suited to tasks that require robust language understanding and generation, particularly where combining the strengths of both parent models is beneficial.
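SLERP merging interpolates each pair of corresponding weight tensors along the arc between them on a hypersphere, rather than along a straight line, which tends to preserve the geometric characteristics of both parents. The model card does not include the merge configuration, so the snippet below is only a minimal NumPy sketch of the core SLERP operation applied to a single weight tensor; the interpolation factor `t` and the fallback threshold are illustrative assumptions, not values taken from this merge.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two tensors treated as vectors on a hypersphere.
    """
    # Compute the angle between the tensors via their normalized flattenings
    u0 = v0.flatten() / (np.linalg.norm(v0) + eps)
    u1 = v1.flatten() / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(u0, u1), -1.0, 1.0)
    omega = np.arccos(dot)

    if omega < eps:
        # Nearly parallel tensors: SLERP degenerates, fall back to LERP
        return (1.0 - t) * v0 + t * v1

    so = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# A real merge would apply this layer by layer across both checkpoints,
# possibly with different t values for different parameter groups.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # midpoint on the arc between orthogonal vectors
```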