Cross-lingual Bootstrapping for Semantic Role Labeling

calendar icon Jan 11, 2013 3254 views
The approach we present uses semantic similarity between parallel sentences to bootstrap semantic role labeling (SRL) models for a pair of languages. The setting is similar to co-training, except that an intermediate model is required to convert SRL structures between the two annotation schemes used for the different languages. This approach can facilitate the construction of SRL models for a resource-poor language, while preserving the annotation scheme designed for that language and leveraging whatever resources are available for it. It can also be extended to benefit from resources in multiple languages simultaneously. We evaluate the model on four language pairs, English paired with German, Spanish, Czech, and Chinese, against a supervised baseline, and discuss the improvements observed, as well as the factors that affect the performance of the model.
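The co-training-style loop described above can be sketched as follows. This is a minimal toy illustration, not the authors' implementation: the names (`train_srl`, `label_with_confidence`, `bootstrap`) and the fixed label mapping standing in for the intermediate scheme-conversion model are all hypothetical, and each "model" is reduced to a predicate-to-role dictionary so the control flow is visible.

```python
# Toy mapping between the two languages' annotation schemes
# (standing in for the intermediate conversion model).
SCHEME_L1_TO_L2 = {"A0": "Agent", "A1": "Patient"}
SCHEME_L2_TO_L1 = {v: k for k, v in SCHEME_L1_TO_L2.items()}

def train_srl(labeled):
    """Train a toy SRL 'model': predicate -> role label."""
    model = {}
    for predicate, role in labeled:
        model[predicate] = role
    return model

def label_with_confidence(model, predicate):
    """Return (role, confidence); known predicates get confidence 1.0."""
    if predicate in model:
        return model[predicate], 1.0
    return None, 0.0

def bootstrap(seed_l1, seed_l2, parallel, rounds=2, threshold=0.9):
    """Co-training over aligned predicate pairs (p1, p2).

    Each round, high-confidence labels from one language's model are
    converted into the other annotation scheme and added to the other
    model's training data, and both models are retrained.
    """
    data_l1, data_l2 = list(seed_l1), list(seed_l2)
    for _ in range(rounds):
        m1, m2 = train_srl(data_l1), train_srl(data_l2)
        for p1, p2 in parallel:
            role1, c1 = label_with_confidence(m1, p1)
            if c1 >= threshold and p2 not in m2:
                data_l2.append((p2, SCHEME_L1_TO_L2[role1]))
            role2, c2 = label_with_confidence(m2, p2)
            if c2 >= threshold and p1 not in m1:
                data_l1.append((p1, SCHEME_L2_TO_L1[role2]))
    return train_srl(data_l1), train_srl(data_l2)

# Example: each language starts with one seed annotation; labels are
# projected across two aligned predicate pairs in both directions.
m1, m2 = bootstrap(
    seed_l1=[("give", "A0")],
    seed_l2=[("comprar", "Patient")],
    parallel=[("give", "dar"), ("buy", "comprar")],
)
```

After bootstrapping, each model has acquired a label for a predicate its own seed data never covered, converted from the other language's scheme (`m2` labels "dar" as "Agent", projected from the "A0" label on "give"; `m1` labels "buy" as "A1", converted from "Patient").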

Except where otherwise noted, content on this site is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International license.