NewstaR/7B-Orfini
Text generation · Concurrency cost: 1 · Model size: 7B · Quantization: FP8 · Context length: 4k · License: MIT · Architecture: Transformer · Open weights

NewstaR/7B-Orfini is an experimental 7-billion-parameter language model created by NewstaR through a custom merge of StableBeluga-7B, orca_mini_v3_7b, and Marcoroni-7B. The model is intended for testing and research: it evaluates the outcome of merging diverse foundation models without any further fine-tuning, exploring how the integration of their architectures and weights affects stability and performance.
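The card does not document the exact merge recipe. A common baseline for this kind of merge is a uniform weighted average of the source models' parameters. The sketch below illustrates that idea on toy parameter dicts; the function name and toy data are illustrative stand-ins, not NewstaR's actual pipeline, and real merges operate on full transformer state dicts (often via tools such as mergekit).

```python
# Minimal sketch of a weighted parameter-average merge, the general family
# of techniques a multi-model merge like 7B-Orfini belongs to.
# Toy dicts of lists stand in for real state-dict tensors.

def merge_state_dicts(state_dicts, weights=None):
    """Element-wise weighted average of parameter tensors (here: lists)."""
    n = len(state_dicts)
    if weights is None:
        weights = [1.0 / n] * n  # uniform average by default
    merged = {}
    for name in state_dicts[0]:
        merged[name] = [
            sum(w * sd[name][i] for sd, w in zip(state_dicts, weights))
            for i in range(len(state_dicts[0][name]))
        ]
    return merged

# Three toy "models" sharing one parameter tensor each.
model_a = {"layer.weight": [1.0, 2.0]}
model_b = {"layer.weight": [3.0, 4.0]}
model_c = {"layer.weight": [5.0, 6.0]}

print(merge_state_dicts([model_a, model_b, model_c]))
# → {'layer.weight': [3.0, 4.0]}
```

A uniform average is only one option; unequal weights (or per-layer schemes such as SLERP) change how much each source model contributes, which is exactly the kind of trade-off an experimental merge like this is meant to probe.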
