viamr-project/qwen3-1.7b-amr-20260206-1235
Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Feb 6, 2026 · Architecture: Transformer

viamr-project/qwen3-1.7b-amr-20260206-1235 is a Qwen3-based model of roughly 2 billion parameters, developed by viamr-project and fine-tuned specifically for Abstract Meaning Representation (AMR) parsing. Trained with a reinforcement learning framework, the model converts natural language sentences into AMR graph representations. It supports a 40,960-token context length and reports an F1 score of 71.31 on AMR parsing.


Model Overview

The viamr-project/qwen3-1.7b-amr-20260206-1235 is a specialized 2 billion parameter model built on the Qwen3 architecture, developed by viamr-project. Its primary function is Abstract Meaning Representation (AMR) parsing, a task focused on converting natural language into a structured, graph-based semantic representation.
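For readers new to AMR, a standard example from the AMR annotation guidelines illustrates what the model produces: the sentence "The boy wants to go" maps to the following PENMAN-serialized graph, where the reused variable b encodes that the boy is both the wanter and the goer:

```
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-02
            :ARG0 b))
```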

Key Capabilities

  • Abstract Meaning Representation (AMR) Parsing: The model is specifically trained to convert English sentences into their corresponding AMR graphs.
  • Reinforcement Learning (RL) Framework: Training was conducted with the veRL reinforcement-learning framework, suggesting the model was optimized against a reward signal (such as a graph-matching score) rather than token-level supervised loss alone.
  • Performance Metrics: Reports an F1 score of 71.31, with precision of 72.74 and recall of 69.94, on its internal AMR-parsing benchmark.
  • Context Length: Supports a context length of 40,960 tokens, which helps when parsing long sentences or documents with complex linguistic structure.
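The reported metrics are internally consistent: as in Smatch-style evaluation, the F1 score is the harmonic mean of precision and recall. A quick check with the card's numbers:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Precision and recall as reported on the model card.
print(round(f1_score(72.74, 69.94), 2))  # → 71.31
```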

Use Cases

This model is particularly well-suited for applications requiring robust semantic understanding and structured representation of text, such as:

  • Natural Language Understanding (NLU): As a component for deeper semantic analysis in NLU systems.
  • Information Extraction: Extracting structured information from unstructured text by first converting it to AMR.
  • Machine Translation: Potentially aiding in meaning-preserving translation by operating on semantic representations.
  • Question Answering: Enhancing the understanding of complex questions by converting them into a formal semantic structure.
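The information-extraction use case can be sketched in a few lines, assuming the model's PENMAN output has already been decoded into a simple node/edge structure (a real pipeline would use a PENMAN parser such as the `penman` library; the graph below is a hand-built stand-in mirroring the AMR for "The boy wants to go"):

```python
# Hypothetical in-memory form of a decoded AMR graph: each variable maps
# to its concept and a list of (role, target-variable) edges.
amr = {
    "w": ("want-01", [(":ARG0", "b"), (":ARG1", "g")]),
    "b": ("boy", []),
    "g": ("go-02", [(":ARG0", "b")]),
}

def extract_triples(graph):
    """Flatten the graph into (source concept, role, target concept) triples."""
    triples = []
    for var, (concept, edges) in graph.items():
        for role, target in edges:
            triples.append((concept, role, graph[target][0]))
    return triples

for triple in extract_triples(amr):
    print(triple)
# ('want-01', ':ARG0', 'boy')
# ('want-01', ':ARG1', 'go-02')
# ('go-02', ':ARG0', 'boy')
```

Structured triples like these can feed a knowledge base or downstream NLU component directly, which is the main practical payoff of parsing to AMR first.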