RatanRohith/NeuralPizza-WestSeverus-7B-Merge-slerp
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Jan 25, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights

RatanRohith/NeuralPizza-WestSeverus-7B-Merge-slerp is a 7-billion-parameter language model created by RatanRohith, produced by a SLERP (spherical linear interpolation) merge of NeuralPizza-7B-V0.1 and PetroGPT/WestSeverus-7B-DPO. By interpolating between the two parent models' weights, the merge aims to retain the strengths of both, offering a balanced performance profile for general language tasks. Its 4096-token context window supports moderate-length interactions and text generation.
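For intuition, a SLERP merge interpolates along the arc between two weight tensors rather than along the straight line, which better preserves their magnitudes. The following is a minimal NumPy sketch of the core interpolation step (the function name and fallback behavior are illustrative, not the exact implementation used to build this model):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two vectors' directions.
    """
    # Normalize to compare directions only
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    theta = np.arccos(dot)  # angle between the two vectors
    if theta < eps:
        # Nearly parallel: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1
```

In a real merge (e.g. with mergekit), this interpolation is applied tensor-by-tensor across both checkpoints, often with different `t` values per layer.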
