marzieh-maleki/llama323b-dnli-s2
Text Generation · Concurrency Cost: 1 · Model Size: 3.2B · Quant: BF16 · Ctx Length: 32k · Published: Mar 23, 2026 · Architecture: Transformer · Warm

marzieh-maleki/llama323b-dnli-s2 is a 3.2-billion-parameter language model with a 32,768-token context length. It is a Hugging Face Transformers model that was automatically pushed to the Hub. Further details about its architecture, training, and specific capabilities are not provided in the available model card.
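Since the card describes a standard Transformers text-generation model, a load-and-generate sketch along the following lines should apply. The causal-LM head, the BF16 dtype handling, and the example prompt are assumptions rather than documented facts about this checkpoint.

```python
# Minimal usage sketch, assuming the repo exposes the standard
# Hugging Face causal-LM interface (not confirmed by the model card).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "marzieh-maleki/llama323b-dnli-s2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 precision listed above
    device_map="auto",           # requires `accelerate`; places weights on available devices
)

# The "dnli" suffix hints at a dialogue/NLI fine-tune, but this prompt
# format is purely illustrative.
prompt = "Premise: A man is playing a guitar. Hypothesis: Someone is making music."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```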
