alnrg2arg/blockchainlabs_7B_merged_test2_4
Task: Text generation · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Jan 17, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open weights · Concurrency cost: 1

blockchainlabs_7B_merged_test2_4 is a 7-billion-parameter language model published by alnrg2arg, produced by merging mlabonne/NeuralBeagle14-7B and udkai/Turdus with the slerp (spherical linear interpolation) merge method. By interpolating the weights of its two base models, the merge aims to combine their strengths in a single checkpoint, targeting general-purpose use with balanced performance across natural language tasks and a 4096-token context length.
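To illustrate the idea behind a slerp merge: rather than averaging two models' weights linearly, slerp interpolates along the arc between the two parameter vectors, preserving their angular relationship. The sketch below is a minimal numpy illustration of that interpolation on small tensors; it is not the actual merge code used for this model (tools such as mergekit apply per-tensor interpolation with configurable blend factors).

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Blends v0 and v1 along the great-circle arc between their
    directions; t=0 returns v0, t=1 returns v1.
    """
    shape = v0.shape
    a = v0.ravel().astype(np.float64)
    b = v1.ravel().astype(np.float64)
    # Angle between the two parameter vectors (directions normalized)
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    dot = np.clip(np.dot(a / (na + eps), b / (nb + eps)), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    out = (np.sin((1 - t) * theta) / s) * a + (np.sin(t * theta) / s) * b
    return out.reshape(shape)

# Toy example: interpolate halfway between two 2x2 "weight" tensors
w0 = np.array([[1.0, 0.0], [0.0, 1.0]])
w1 = np.array([[0.0, 1.0], [1.0, 0.0]])
mid = slerp(0.5, w0, w1)
```

In a real merge this interpolation is applied tensor-by-tensor across both checkpoints, often with different blend factors for attention and MLP layers.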
