akhil-dua/llama-3.2-1b-redteam_ift
Text generation · Concurrency cost: 1 · Model size: 1B · Quant: BF16 · Context length: 32k · Published: Jan 19, 2026 · Architecture: Transformer · Status: Warm

The akhil-dua/llama-3.2-1b-redteam_ift model is a 1-billion-parameter language model with a 32,768-token context window. Judging by its name, it is a fine-tuned variant of Llama 3.2 1B, likely adapted for red-teaming or safety-related tasks. Its compact size makes it suitable for applications requiring efficient inference, while the large context window supports long, complex interactions. Further details on its training data and capabilities are not provided in the available documentation.
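As a rough illustration of what a 32,768-token context window allows, the sketch below estimates whether a prompt plus an output budget fits in the window. It uses a ~4-characters-per-token heuristic as an assumption; an exact count would require the model's actual tokenizer, which is not loaded here.

```python
# Rough context-window budgeting for a 32k-token model.
# ASSUMPTION: ~4 characters per token is a heuristic for English text;
# the real count depends on the model's tokenizer.

CTX_LENGTH = 32_768   # context length from the model card
CHARS_PER_TOKEN = 4   # rough heuristic, not tokenizer-accurate


def estimated_tokens(text: str) -> int:
    """Estimate token count from character length (heuristic)."""
    return max(1, len(text) // CHARS_PER_TOKEN)


def fits_in_context(prompt: str, reserved_for_output: int = 1024) -> bool:
    """Check whether a prompt plus a reserved output budget fits in the window."""
    return estimated_tokens(prompt) + reserved_for_output <= CTX_LENGTH


prompt = "Summarize the following incident report. " * 50
print(estimated_tokens(prompt), fits_in_context(prompt))
```

For production use, replace the heuristic with a real token count from the model's tokenizer before truncating or rejecting a prompt.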
