mylesgoose/Llama-3.2-1B-Instruct-abliterated
Text Generation · Concurrency Cost: 1 · Model Size: 1B · Quant: BF16 · Ctx Length: 32k · Published: Oct 1, 2024 · License: llama3.2 · Architecture: Transformer

mylesgoose/Llama-3.2-1B-Instruct-abliterated is a 1.23-billion-parameter instruction-tuned causal language model: an "abliterated" variant of Meta's Llama-3.2-1B-Instruct, part of the Llama 3.2 collection, with the refusal behavior suppressed in the base model's weights. The underlying Llama 3.2 model was trained by Meta on up to 9 trillion tokens, supports a 128k context length (served here with a 32k context window), and is optimized for multilingual dialogue use cases, including agentic retrieval and summarization. Meta reports that it outperforms many open-source and closed chat models on common industry benchmarks in assistant-like chat applications across multiple languages.
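As an instruct model, it expects prompts in the Llama 3 chat format. A minimal sketch of assembling such a prompt by hand, assuming the publicly documented Llama 3 instruct template (in practice, prefer the chat template bundled with the model's tokenizer, which is authoritative):

```python
def format_llama3_chat(messages):
    """Assemble a prompt string in the (assumed) Llama 3 instruct format.

    Each message is a dict with 'role' ('system', 'user', or 'assistant')
    and 'content'. The trailing assistant header cues the model to generate
    its reply. Verify against the model's own tokenizer chat template.
    """
    parts = ["<|begin_of_text|>"]
    for m in messages:
        parts.append(
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
            f"{m['content']}<|eot_id|>"
        )
    # Open the assistant turn so generation continues from here.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)


prompt = format_llama3_chat([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize Llama 3.2 in one sentence."},
])
```

The resulting string would be tokenized and sent to the model for completion; the special tokens shown are those used by the Llama 3 family, but the tokenizer's built-in chat template should be treated as the source of truth.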
