
Can the Rich Help the Poor? Transfer of Knowledge and Resources for Under Resourced Languages Semantic Role Labeling

EasyChair Preprint 10380

11 pages
Date: June 11, 2023

Abstract

Semantic Role Labeling (SRL) is a core natural language processing task that assigns semantic roles to words or phrases in a sentence. While SRL has been studied extensively for well-resourced languages, under-resourced languages face significant challenges due to the lack of annotated data and language-specific resources. In this paper, we explore the potential of leveraging knowledge and resources from well-resourced languages and domains to improve SRL performance in under-resourced languages. We provide an overview of the SRL process, discuss the challenges specific to under-resourced languages, and present techniques for transferring knowledge and resources from well-resourced languages and domains. Such transfer provides access to pre-trained models, lexical resources, and annotated data, which can improve SRL accuracy. However, obstacles such as language divergence and domain mismatch must also be addressed. We discuss these obstacles and propose possible solutions, including domain adaptation and data augmentation techniques. Finally, we conclude by emphasizing the importance of leveraging the resources of well-resourced languages to advance SRL in under-resourced languages.
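To make the kind of cross-lingual transfer the abstract refers to concrete, the sketch below shows one common setup: a multilingual encoder fine-tuned for token-level role tagging on a well-resourced language and then applied zero-shot to an under-resourced one. The choice of XLM-R, the BIO-style role tag set, and the fine-tuning placeholder are illustrative assumptions, not the authors' specific method.

```python
# Minimal sketch of cross-lingual transfer for SRL with a multilingual encoder.
# Assumptions (not from the paper): xlm-roberta-base as the shared encoder,
# a small BIO-style role tag set, and fine-tuning on annotated data from a
# well-resourced language; the under-resourced language is served zero-shot.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

ROLE_TAGS = ["O", "B-ARG0", "I-ARG0", "B-ARG1", "I-ARG1", "B-ARGM-LOC", "I-ARGM-LOC"]

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForTokenClassification.from_pretrained(
    "xlm-roberta-base", num_labels=len(ROLE_TAGS)
)
# ... fine-tune `model` here on SRL-annotated sentences from a well-resourced
#     language (e.g. PropBank-style annotations); omitted in this sketch ...

def label_roles(sentence: str):
    """Predict one role tag per subword token; the same model handles any
    language covered by the multilingual encoder."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        logits = model(**enc).logits            # shape: [1, seq_len, num_tags]
    tags = [ROLE_TAGS[i] for i in logits.argmax(-1)[0].tolist()]
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
    return list(zip(tokens, tags))

# Zero-shot application to a sentence in the under-resourced target language:
print(label_roles("Example sentence in the target language."))
```

The key design point is that the encoder parameters are shared across languages, so role-labeling knowledge learned from the annotated well-resourced data can transfer to languages with little or no SRL annotation; domain adaptation and data augmentation, as discussed in the abstract, would further narrow the divergence between source and target.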

Keyphrases: domain adaptation, transfer learning, multi-task learning, under-resourced languages

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@booklet{EasyChair:10380,
  author    = {Yesuf Mohamed and Wolfgang Menzel},
  title     = {Can the Rich Help the Poor? Transfer of Knowledge and Resources for Under Resourced Languages Semantic Role Labeling},
  howpublished = {EasyChair Preprint 10380},
  year      = {EasyChair, 2023}}