alnrg2arg/blockchainlabs_joe_bez_seminar
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Context Length: 4k · Published: Feb 5, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights

alnrg2arg/blockchainlabs_joe_bez_seminar is a 7-billion-parameter language model created by alnrg2arg. It was produced by merging flemmingmiguel/MBX-7B-v3 and vanillaOVO/supermario_v4 with SLERP (spherical linear interpolation). The model supports a 4096-token context length and is intended for general language tasks, combining the strengths of its two constituent models.
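The card does not include the merge configuration, so the following is only a minimal sketch of how a SLERP merge interpolates between two checkpoints: each pair of corresponding weight tensors is blended along the arc between them rather than along a straight line, which preserves the norm structure of the weights better than plain averaging. The function name `slerp` and the interpolation factor `t` are illustrative, not taken from the model's actual merge recipe.

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t = 0 returns v0, t = 1 returns v1; values in between move along
    the arc connecting the two (flattened) weight vectors.
    """
    v0_flat, v1_flat = v0.flatten().float(), v1.flatten().float()

    # Angle between the two weight vectors, computed on unit vectors.
    v0_unit = v0_flat / (v0_flat.norm() + eps)
    v1_unit = v1_flat / (v1_flat.norm() + eps)
    dot = torch.clamp(torch.dot(v0_unit, v1_unit), -1.0, 1.0)
    theta = torch.acos(dot)

    # Nearly parallel tensors: fall back to plain linear interpolation.
    if theta.abs() < 1e-4:
        return (1.0 - t) * v0 + t * v1

    # Standard SLERP weights.
    sin_theta = torch.sin(theta)
    w0 = torch.sin((1.0 - t) * theta) / sin_theta
    w1 = torch.sin(t * theta) / sin_theta
    return (w0 * v0_flat + w1 * v1_flat).reshape(v0.shape).to(v0.dtype)
```

In practice, merge tooling such as mergekit applies an interpolation like this per tensor across the two source models, often with different `t` values for different layer groups.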
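For text generation, the model can be loaded with the standard `transformers` causal-LM API. This assumes the checkpoint is hosted on the Hugging Face Hub under the identifier shown above; the prompt and generation settings are placeholders.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "alnrg2arg/blockchainlabs_joe_bez_seminar"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # place layers on available GPU(s)/CPU
)

prompt = "Explain spherical linear interpolation in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```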
