grimjim/Nemo-Instruct-2407-MPOA-v4-12B
Task: text generation
Model size: 12B
Quantization: FP8
Context length: 32k
License: apache-2.0
Architecture: Transformer

Nemo-Instruct-2407-MPOA-v4-12B is a 12-billion-parameter instruction-tuned model from grimjim, with Magnitude-Preserving Orthogonalized Ablation (MPOA) applied to selected layers. The model targets varied text completion and produces coherent English text. It is designed with attention to balancing safety refusals, making it suitable for diverse generative tasks.
