jasong03/qwen3-1.7b-bilingual-amr-sft-v1
Text Generation · Concurrency Cost: 1 · Model Size: 2B · Quant: BF16 · Ctx Length: 32k · Published: Feb 19, 2026 · License: apache-2.0 · Architecture: Transformer · Open Weights · Warm
jasong03/qwen3-1.7b-bilingual-amr-sft-v1 is a 1.7 billion parameter language model fine-tuned from Qwen3-1.7B for bilingual AMR (Abstract Meaning Representation) tasks, i.e., semantic parsing across multiple languages. It is intended for applications that require natural language understanding and semantic representation in different linguistic contexts, and its 32,768-token context length supports longer inputs for complex semantic analysis.
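A minimal usage sketch is shown below, assuming the model loads through the standard Hugging Face transformers causal-LM interface. The prompt wrapper ("Parse the following sentence into AMR:") is an assumption for illustration; the actual instruction template used during fine-tuning may differ.

```python
# Minimal sketch: loading the model and generating an AMR graph for one sentence.
# The prompt format is hypothetical; adjust it to the model's actual SFT template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jasong03/qwen3-1.7b-bilingual-amr-sft-v1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Hypothetical instruction wrapper around the input sentence.
sentence = "The boy wants to go."
prompt = f"Parse the following sentence into AMR:\n{sentence}\nAMR:"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Decode only the newly generated tokens (the predicted AMR graph).
generated = outputs[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(generated, skip_special_tokens=True))
```

Greedy decoding (do_sample=False) is used here because semantic parsing favors deterministic output over sampled diversity; for non-English input, the same call applies since the model is fine-tuned bilingually.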