abacusai/Dracarys-72B-Instruct
Text Generation | Concurrency Cost: 4 | Model Size: 72.7B | Quant: FP8 | Ctx Length: 32k | Published: Aug 14, 2024 | License: tongyi-qianwen | Architecture: Transformer
Dracarys-72B-Instruct is a 72.7 billion parameter instruction-tuned causal language model developed by Abacus.AI, fine-tuned from Qwen2-72B-Instruct. The model specializes in coding, with improved LiveCodeBench scores for code generation and execution over its base model. With a native context length of 131072 tokens, it is optimized for data science coding assistance and generating Python code.
Popular Sampler Settings
Top 3 parameter combinations used by Featherless users for this model.
- temperature
- top_p
- top_k
- frequency_penalty
- presence_penalty
- repetition_penalty
- min_p
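A sampler combination like the ones above can be applied through an OpenAI-compatible chat-completions request. The sketch below is illustrative only: the endpoint URL and every parameter value are assumptions (the page's actual top-3 configs are not reproduced here), and the non-standard keys (`top_k`, `repetition_penalty`, `min_p`) are passed as extra fields, which many OpenAI-compatible inference servers accept.

```python
# Sketch: combining a prompt with a sampler-settings dict into a request
# payload for an OpenAI-compatible chat-completions endpoint.
import json

# Assumed endpoint URL for illustration; check the provider's API docs.
FEATHERLESS_URL = "https://api.featherless.ai/v1/chat/completions"


def build_request(prompt: str, sampler: dict) -> dict:
    """Merge a user prompt and sampler settings into one request payload."""
    payload = {
        "model": "abacusai/Dracarys-72B-Instruct",
        "messages": [{"role": "user", "content": prompt}],
    }
    # temperature, top_p, top_k, etc. ride alongside the messages as
    # top-level fields in the request body.
    payload.update(sampler)
    return payload


# Illustrative values only -- not the page's actual popular configs.
sampler = {
    "temperature": 0.7,
    "top_p": 0.9,
    "top_k": 40,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
}

body = json.dumps(build_request("Write a pandas groupby example.", sampler))
```

The serialized `body` would then be POSTed to the endpoint with an API key in the `Authorization` header; only the payload construction is shown here.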