djuna/MN-Chinofun-12B-2
Text generation · Concurrency cost: 1 · Model size: 12B · Quant: FP8 · Context length: 32k · Published: Oct 23, 2024 · Architecture: Transformer
djuna/MN-Chinofun-12B-2 is a 12-billion-parameter language model created by djuna using the Model Stock merge method, with ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.2 as the base model. The merge integrates capabilities from several pre-trained models and supports a 32,768-token context window. By combining the strengths of its constituent models, it is intended for general-purpose text generation.
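Model Stock merges like this one are typically produced with the mergekit toolkit. A minimal sketch of such a configuration is shown below; the base model is taken from this card, but the two constituent model names are placeholders, since the card does not list which models were merged.

```yaml
# Hypothetical mergekit configuration for a Model Stock merge.
# Only base_model reflects this card; the other entries are placeholders.
merge_method: model_stock
base_model: ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.2
models:
  - model: example-org/placeholder-model-a   # placeholder
  - model: example-org/placeholder-model-b   # placeholder
dtype: bfloat16
```

Model Stock averages the weights of the listed models toward the base model, which is why a compatible shared base (here, a Mistral-Nemo 12B derivative) is required for all inputs to the merge.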