arcee-ai/Virtuoso-Small-v2 is a 14.8-billion-parameter language model distilled from DeepSeek-V3 and built on the Qwen-2.5-14B architecture. The model uses logit-level distillation and "fusion merging" to transfer advanced reasoning capabilities from the teacher. It performs strongly on technical and scientific queries, complex code generation, and mathematical problem-solving, and supports a 128k-token context length.
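A minimal usage sketch, assuming the checkpoint is published under the same identifier on the Hugging Face Hub and loads with the standard transformers causal-LM API; the prompt and generation settings are illustrative, not prescribed by the model card.

```python
# Sketch: load and query arcee-ai/Virtuoso-Small-v2 with Hugging Face transformers.
# Assumes the tokenizer ships with a chat template and that `accelerate` is
# installed so device_map="auto" can place the weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/Virtuoso-Small-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

messages = [
    {"role": "user", "content": "Differentiate f(x) = x^3 * ln(x) and simplify."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Illustrative settings; adjust max_new_tokens and sampling for your workload.
outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```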