abacusai/Dracarys2-Llama-3.1-70B-Instruct
TEXT GENERATION · Concurrency Cost: 4 · Model Size: 70B · Quant: FP8 · Ctx Length: 32K · License: llama3 · Architecture: Transformer

Dracarys2-Llama-3.1-70B-Instruct is a 70 billion parameter instruction-tuned language model developed by Abacus.AI, fine-tuned from Meta-Llama-3.1-70B-Instruct with a 32K context length. This model is specifically optimized for coding performance, demonstrating improved scores on LiveCodeBench for code generation and test output prediction compared to its base model. It excels as a data science coding assistant, particularly for Python code generation using libraries like Pandas and NumPy.
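Since the model is fine-tuned from Meta-Llama-3.1-70B-Instruct, it presumably expects prompts in the Llama 3.1 chat format. As an illustration (an assumption, not taken from this page — in practice the tokenizer's built-in chat template should be used), a minimal sketch of assembling such a prompt for a Pandas question might look like:

```python
def build_llama31_prompt(messages):
    """Assemble a Llama 3.1-style chat prompt from role/content messages.

    This is a hand-rolled sketch of the Llama 3.1 special-token layout;
    with the transformers library you would instead call
    tokenizer.apply_chat_template(messages, add_generation_prompt=True).
    """
    parts = ["<|begin_of_text|>"]
    for m in messages:
        # Each turn is wrapped in header tokens and terminated with <|eot_id|>.
        parts.append(
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n{m['content']}<|eot_id|>"
        )
    # Open an assistant header so the model generates the reply.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)


prompt = build_llama31_prompt([
    {"role": "system", "content": "You are a data science coding assistant."},
    {"role": "user", "content": "Write Pandas code to drop duplicate rows."},
])
print(prompt)
```

The actual serving stack (e.g. vLLM or transformers) handles this formatting automatically when given structured chat messages.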
