alnrg2arg/blockchainlabs_test3_seminar
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · Published: Feb 2, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
alnrg2arg/blockchainlabs_test3_seminar is a 7-billion-parameter language model created by alnrg2arg by merging FelixChao/WestSeverus-7B-DPO-v2 and macadeliccc/WestLake-7B-v2-laser-truthy-dpo with the SLERP (spherical linear interpolation) merge method. The merge combines the strengths of its two DPO-tuned parent models, yielding broad capability on general language tasks. It targets applications that need a robust 7B-parameter model with a 4096-token context length.
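SLERP merges of this kind are commonly produced with the mergekit tool; the card does not publish the actual merge configuration, so the following is only a hypothetical sketch of what such a config could look like. The `layer_range`, `t` value, and choice of base model are all assumptions, not the author's recipe.

```yaml
# Hypothetical mergekit-style SLERP config (illustrative only).
# Real values used for blockchainlabs_test3_seminar are not published.
slices:
  - sources:
      - model: FelixChao/WestSeverus-7B-DPO-v2
        layer_range: [0, 32]          # assumed: all 32 layers of a 7B Mistral-style model
      - model: macadeliccc/WestLake-7B-v2-laser-truthy-dpo
        layer_range: [0, 32]
merge_method: slerp
base_model: FelixChao/WestSeverus-7B-DPO-v2  # assumed base
parameters:
  t: 0.5                              # interpolation factor; 0.5 weights both parents equally
dtype: bfloat16
```

SLERP interpolates along the arc between the two weight vectors rather than along the straight line, which preserves the norm of the merged weights better than plain linear averaging.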