Model Overview
Vortex5/MN-12B-Azure-Veil is a 12-billion-parameter language model developed by Vortex5. It was built with mergekit's passthrough merge method, combining selected layer ranges from four pre-trained models to integrate their strengths.
Key Merge Details
This model is a composite of:
- anthracite-org/magnum-v4-12b (layers 0-15)
- SicariusSicariiStuff/Impish_Nemo_12B (layers 15-20)
- crestf411/MN-Slush (layers 20-32)
- Vortex5/Moonlit-Shadow-12B (layers 32-40)
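For reference, a plausible reconstruction of the mergekit recipe is sketched below. It is not the published configuration file: it assumes mergekit's half-open layer_range convention (consistent with the ranges above summing to the model's 40 layers) and mirrors the tokenizer and dtype settings described in the next paragraph.

```yaml
# Hypothetical reconstruction of the merge recipe, not the published file.
# layer_range is assumed half-open, so the slices below cover
# 15 + 5 + 12 + 8 = 40 layers with no overlap.
slices:
  - sources:
      - model: anthracite-org/magnum-v4-12b
        layer_range: [0, 15]
  - sources:
      - model: SicariusSicariiStuff/Impish_Nemo_12B
        layer_range: [15, 20]
  - sources:
      - model: crestf411/MN-Slush
        layer_range: [20, 32]
  - sources:
      - model: Vortex5/Moonlit-Shadow-12B
        layer_range: [32, 40]
merge_method: passthrough
dtype: bfloat16
tokenizer_source: anthracite-org/magnum-v4-12b
```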
This layered approach selectively carries over transformer blocks from each source model, aiming for a versatile and robust result. The tokenizer is sourced from anthracite-org/magnum-v4-12b, and the weights are stored in bfloat16. With a context length of 32768 tokens, MN-12B-Azure-Veil is suited to applications that process longer inputs and generate extended outputs, drawing on the combined capabilities of its merged predecessors.
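Loading the merged model follows the standard transformers pattern; the minimal sketch below assumes the model is published under the Vortex5/MN-12B-Azure-Veil repository ID, and the prompt and generation settings are purely illustrative.

```python
# Minimal usage sketch; repository ID and generation settings are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Vortex5/MN-12B-Azure-Veil"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge's native dtype
    device_map="auto",
)

prompt = "Write a short scene set in a moonlit harbor."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```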