marzieh-maleki/llama323b-dnli-s1
Text generation · Model size: 3.2B · Quant: BF16 · Ctx length: 32k · Concurrency cost: 1 · Published: Mar 23, 2026 · Architecture: Transformer

marzieh-maleki/llama323b-dnli-s1 is a 3.2-billion-parameter language model with a 32,768-token (32k) context window. Published by marzieh-maleki, it is distributed as a Hugging Face Transformers model. Because the model card provides little information, its specific architecture, training details, and primary differentiators are not stated; further documentation is needed to determine its optimal use cases or unique strengths relative to other LLMs.
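Since the card identifies this as a Hugging Face Transformers model, it can presumably be loaded with the standard `AutoModelForCausalLM` / `AutoTokenizer` API. The sketch below is an assumption based on that statement, not an official usage example from the author: the model id and the 32,768-token context length come from the card above, while the helper function, generation settings, and BF16 dtype choice are illustrative.

```python
# Minimal usage sketch, assuming the model works with the standard
# Transformers causal-LM API (not confirmed by the model card).

MODEL_ID = "marzieh-maleki/llama323b-dnli-s1"
MAX_CTX = 32768  # context length stated on the card


def clamp_new_tokens(prompt_tokens: int, requested: int,
                     max_ctx: int = MAX_CTX) -> int:
    """Cap the completion length so prompt + completion fits the context window."""
    return max(0, min(requested, max_ctx - prompt_tokens))


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load in BF16 (the quantization stated on the card) and generate.

    Downloads the weights on first call; requires `transformers` and `torch`.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="bfloat16")
    inputs = tok(prompt, return_tensors="pt")
    # Keep prompt + completion inside the 32k window.
    budget = clamp_new_tokens(inputs["input_ids"].shape[1], max_new_tokens)
    out = model.generate(**inputs, max_new_tokens=budget)
    return tok.decode(out[0], skip_special_tokens=True)
```

The `clamp_new_tokens` helper is hypothetical but shows the arithmetic the 32k window implies: a 32,700-token prompt leaves at most 68 tokens of completion.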
