BryanSwk/LaserPipe-7B-SLERP
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Feb 7, 2024 · License: cc-by-nc-4.0 · Architecture: Transformer · Open Weights

BryanSwk/LaserPipe-7B-SLERP is a 7-billion-parameter language model created by BryanSwk through a SLERP (spherical linear interpolation) merge of OpenPipe/mistral-ft-optimized-1218 and macadeliccc/WestLake-7B-v2-laser-truthy-dpo. The merge combines the strengths of its two constituent models into a single checkpoint aimed at general language tasks. It is intended for experimentation with merged model architectures and is also distributed as a Q4_K_M .gguf quantization for CPU inference.
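As a rough illustration of what a SLERP merge does, the sketch below interpolates a pair of weight tensors along the great circle between them. The tensor shapes, interpolation factor, and layer choice are illustrative assumptions, not taken from this model's actual merge configuration.

```python
import numpy as np

def slerp(w_a: np.ndarray, w_b: np.ndarray, t: float, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors of the same shape."""
    a = w_a.flatten()
    b = w_b.flatten()
    a_dir = a / (np.linalg.norm(a) + eps)
    b_dir = b / (np.linalg.norm(b) + eps)
    # Angle between the two direction vectors.
    dot = np.clip(np.dot(a_dir, b_dir), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return (1.0 - t) * w_a + t * w_b
    # Weights for interpolating along the great circle between the two tensors.
    s_a = np.sin((1.0 - t) * theta) / np.sin(theta)
    s_b = np.sin(t * theta) / np.sin(theta)
    return (s_a * w_a + s_b * w_b).reshape(w_a.shape)

# Illustrative only: merge one layer's weights at t = 0.5.
layer_a = np.random.randn(4096, 4096).astype(np.float32)
layer_b = np.random.randn(4096, 4096).astype(np.float32)
merged = slerp(layer_a, layer_b, t=0.5)
```

For CPU inference with the Q4_K_M .gguf file, one option is llama-cpp-python; the file name below is a placeholder for wherever the downloaded quantized weights live.

```python
from llama_cpp import Llama

# Path is illustrative; point it at the downloaded Q4_K_M .gguf file.
llm = Llama(model_path="laserpipe-7b-slerp.Q4_K_M.gguf", n_ctx=4096)
out = llm("Briefly explain SLERP model merging.", max_tokens=128)
print(out["choices"][0]["text"])
```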
