NemoReRemix-12B by MarinaraSpaghetti is a 12-billion-parameter language model merged from several pre-trained models, including Mistral-Nemo-Instruct and migtissera's Tess-3-Mistral-Nemo. Optimized for storytelling and roleplay, it also works well as a general assistant. Compared to its predecessors, it improves prose quality and formatting consistency, and it supports a 32,768-token context length for conversational applications.
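A minimal sketch of loading the model for chat-style generation with the Hugging Face `transformers` library; the repository id `MarinaraSpaghetti/NemoReRemix-12B` and the sampling settings are assumptions, not part of this listing.

```python
# Minimal sketch: load and prompt the model with transformers.
# The repo id below is assumed from the model name; adjust if it differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MarinaraSpaghetti/NemoReRemix-12B"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",     # place layers across available GPUs/CPU
    torch_dtype="auto",    # use the checkpoint's native precision
)

# Mistral-Nemo-based instruct merges ship a chat template; apply it for prompts.
messages = [
    {"role": "user", "content": "Narrate the opening scene of a heist in a floating city."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```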