nbeerbower/bruphin-theta
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quant: FP8 · Ctx Length: 4k · Published: Mar 10, 2024 · License: apache-2.0 · Architecture: Transformer · Open Weights · Cold
nbeerbower/bruphin-theta is a 7-billion-parameter language model created by nbeerbower, produced by a spherical linear interpolation (SLERP) merge of Weyaxi/Einstein-v4-7B and nbeerbower/bruphin-eta. The merge blends the weights of its two constituent models, yielding a combined performance profile for general language tasks. Its 4096-token context length supports moderately long inputs, making the model suitable for applications that need a balance of capacity and efficiency.
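To illustrate the SLERP operation underlying the merge, here is a minimal NumPy sketch of spherical linear interpolation between two weight vectors. This is an illustrative simplification, not the actual merge pipeline (which was presumably done per-tensor with a tool such as mergekit); the function name and interpolation factor `t` are assumptions for the example.

```python
import numpy as np

def slerp(v0, v1, t, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    Illustrative sketch of the SLERP operation used in model merging;
    not the exact procedure used to produce bruphin-theta.
    """
    v0 = np.asarray(v0, dtype=np.float64)
    v1 = np.asarray(v1, dtype=np.float64)
    # Angle between the normalized vectors.
    dot = np.dot(v0 / np.linalg.norm(v0), v1 / np.linalg.norm(v1))
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    # Fall back to linear interpolation when the vectors are nearly parallel.
    if abs(theta) < eps:
        return (1.0 - t) * v0 + t * v1
    sin_theta = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / sin_theta) * v0 \
         + (np.sin(t * theta) / sin_theta) * v1

# t=0 recovers the first model's weights, t=1 the second's;
# intermediate t values follow the arc between them.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
print(slerp(a, b, 0.5))  # midpoint on the unit-circle arc between a and b
```

Unlike plain linear averaging, SLERP interpolates along the arc between the two weight vectors, which preserves their norm-related geometry and is a common choice for merging similarly-trained 7B checkpoints.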