nbeerbower/bruphin-lambda
Text generation · Concurrency cost: 1 · Model size: 7B · Quant: FP8 · Context length: 4k · Published: Mar 30, 2024 · License: apache-2.0 · Architecture: Transformer · Open weights
nbeerbower/bruphin-lambda is a 7-billion-parameter language model created by nbeerbower, produced by a SLERP (spherical linear interpolation) merge of chihoonlee10/T3Q-Mistral-Orca-Math-DPO and nbeerbower/bruphin-kappa. The merge combines the strengths of both parents, in particular the math-focused DPO fine-tuning of T3Q-Mistral-Orca-Math-DPO. With a 4096-token context length, it targets tasks that require robust language understanding and potentially enhanced mathematical reasoning.
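Merges of this kind are commonly produced with mergekit. A minimal SLERP configuration for the two parent models might look like the sketch below; the layer range, base model choice, interpolation factor `t`, and dtype are illustrative assumptions, not the author's published settings:

```yaml
# Hypothetical mergekit SLERP config; values are assumptions, not the
# settings actually used to build bruphin-lambda.
slices:
  - sources:
      - model: chihoonlee10/T3Q-Mistral-Orca-Math-DPO
        layer_range: [0, 32]   # Mistral-7B has 32 decoder layers
      - model: nbeerbower/bruphin-kappa
        layer_range: [0, 32]
merge_method: slerp
base_model: nbeerbower/bruphin-kappa
parameters:
  t: 0.5   # interpolation factor: 0 = base model, 1 = the other parent
dtype: bfloat16
```

SLERP interpolates along the great circle between the two weight vectors rather than linearly, which tends to preserve the norm of the merged weights.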