formulae/7B-Dorflan
Text Generation · Concurrency Cost: 1 · Model Size: 7B · Quantization: FP8 · Context Length: 4k · License: MIT · Architecture: Transformer · Open Weights

formulae/7B-Dorflan is an experimental 7-billion-parameter merged language model created by formulae, combining stabilityai/StableBeluga-7B, ehartford/dolphin-llama2-7b, and AIDC-ai-business/Marcoroni-7B. The model was produced with a custom merging technique and no further fine-tuning, and is intended primarily for testing and research. It achieves an average score of 58.19 across benchmarks including ARC, HellaSwag, MMLU, and TruthfulQA, making it a useful subject for studying the effects of model merging.
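The card does not document the custom merging technique. A common baseline for merging models with identical architectures is simple weighted averaging of their parameters. The sketch below illustrates that idea with plain dictionaries standing in for model state dicts; the function name and toy data are hypothetical, not part of this model's actual recipe.

```python
def merge_state_dicts(state_dicts, weights=None):
    """Merge several models by weighted-averaging each parameter.

    All state dicts are assumed to share identical keys and shapes
    (i.e., the same architecture). Uniform weights by default.
    """
    if weights is None:
        weights = [1.0 / len(state_dicts)] * len(state_dicts)
    merged = {}
    for key in state_dicts[0]:
        merged[key] = sum(w * sd[key] for w, sd in zip(weights, state_dicts))
    return merged

# Toy example: three "models", each with one scalar parameter.
models = [
    {"layer.weight": 1.0},
    {"layer.weight": 2.0},
    {"layer.weight": 3.0},
]
merged = merge_state_dicts(models)
# Uniform averaging gives the mean of the three values.
```

In practice this would operate on tensors (e.g. `torch` state dicts) rather than floats, and real merge methods often go beyond uniform averaging, for example weighting models per layer or per task.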
