baseten/Llama-3.2-3B-Instruct-pythonic is a 3-billion-parameter (3B) instruction-tuned causal language model from Meta's Llama 3.2 family, optimized for multilingual dialogue. It is well suited to agentic retrieval and summarization tasks, and Meta reports that it outperforms many open-source and closed chat models on common industry benchmarks. The model officially supports English, German, French, Italian, Portuguese, Hindi, Spanish, and Thai, and uses an optimized transformer architecture with Grouped-Query Attention (GQA) for improved inference scalability.
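As a rough usage sketch, the model can likely be run with the Hugging Face transformers chat pipeline in the same way as other Llama 3.2 Instruct checkpoints; the repo id, prompts, and generation settings below are illustrative assumptions, not an official example:

```python
import torch
from transformers import pipeline

# Assumed repo id for this deployment; swap in the checkpoint you actually use.
model_id = "baseten/Llama-3.2-3B-Instruct-pythonic"

# Standard chat-style text-generation pipeline for Llama 3.2 Instruct models.
pipe = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a concise multilingual assistant."},
    {"role": "user", "content": "Summarize the benefits of Grouped-Query Attention in two sentences."},
]

outputs = pipe(messages, max_new_tokens=128)
# The pipeline returns the full chat transcript; the last message is the model's reply.
print(outputs[0]["generated_text"][-1]["content"])
```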