Kquant03/Samlagast-7B-bf16
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k
Published: Feb 9, 2024 · License: apache-2.0 · Architecture: Transformer

Kquant03/Samlagast-7B-bf16 is a 7-billion-parameter language model created by Kquant03, produced by merging several pre-trained language models with the task arithmetic method. The merge combines NeuralOmniWestBeaglake-7B, Faraday-7B, and MBX-7B-v3, using NeuralOmniBeagleMBX-v3-7B as the base model. It is designed to explore the outcomes of merging diverse language models, offering a blend of their underlying capabilities.
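Task arithmetic merges models by adding a weighted sum of parameter deltas (each source model minus the base) back onto the base model's weights. The sketch below illustrates the idea with plain Python dicts of scalars standing in for real checkpoints; the parameter names and merge weights are illustrative, not the actual recipe used for this model.

```python
def task_arithmetic_merge(base, models, weights):
    """Task arithmetic merge: merged = base + sum_i w_i * (model_i - base).

    Each checkpoint is a dict mapping parameter names to floats,
    standing in for real weight tensors.
    """
    merged = {}
    for name, base_val in base.items():
        delta = sum(w * (m[name] - base_val) for m, w in zip(models, weights))
        merged[name] = base_val + delta
    return merged


# Toy example with one scalar "parameter" per checkpoint.
base = {"layer.weight": 1.0}      # stand-in for the base model
model_a = {"layer.weight": 1.5}   # stand-in for one source model
model_b = {"layer.weight": 0.5}   # stand-in for another source model

merged = task_arithmetic_merge(base, [model_a, model_b], weights=[0.5, 0.5])
print(merged["layer.weight"])  # 1.0 + 0.5*(0.5) + 0.5*(-0.5) = 1.0
```

In practice such merges operate tensor-by-tensor over full checkpoints (e.g. via a merge toolkit), but the arithmetic is exactly this: scaled deltas from each source model applied on top of the shared base.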
