RJuro/munin-neuralbeagle-7b
Text Generation | Concurrency Cost: 1 | Model Size: 7B | Quant: FP8 | Ctx Length: 4k | Published: Jan 17, 2024 | License: cc-by-nc-4.0 | Architecture: Transformer | Open Weights
RJuro/munin-neuralbeagle-7b is a 7-billion-parameter language model created by RJuro by merging danish-foundation-models/munin-7b-alpha and mlabonne/NeuralBeagle14-7B with the DARE TIES merge method. The merge is optimized for Mainland Scandinavian natural language generation (NLG) tasks, and the model currently ranks highly on the Mainland Scandinavian NLG leaderboard.
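Since the merged weights are open and hosted on the Hugging Face Hub, the model can be loaded with the standard transformers API. The sketch below is a minimal, illustrative example; the prompt and sampling parameters are assumptions for demonstration, not settings recommended by the author.

```python
# Minimal sketch: load RJuro/munin-neuralbeagle-7b with transformers.
# Sampling parameters are illustrative defaults, not tuned values.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RJuro/munin-neuralbeagle-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision fits a 7B model on a single 24 GB GPU
    device_map="auto",
)

# Hypothetical Danish prompt: "Write a short story about a fox."
prompt = "Skriv en kort historie om en ræv."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=200,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```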