Lab meeting - Sivan Milton and Nathan Schucher
At this week’s lab meeting, Sivan Milton and Nathan Schucher will each give a talk.
- Tuesday, March 22, 15:00–16:00 (Montréal time, UTC-4).
- Meetings are via Zoom. If you would like to attend the talks but have not yet signed up for the MCQLL meetings this semester, please register here.
Sivan Milton
Title: Crowdsourcing Semantic Role Labels in a Cross-lingual Context
Abstract:
Semantic Role Labelling (SRL), the process of annotating predicates, arguments, and their thematic roles, has significant applications in NLP. However, such annotations can be highly expensive to produce and are typically focused on English. We propose a crowdsourcing approach that does not require linguist annotators and can be applied to many languages. We use the contextualized question generator from Pyatkin et al. (2021) to generate naturalized questions that are accessible to non-linguist crowdworkers, targeting arguments for a given predicate in a translated text. We then align crowdsourced English responses to their correspondents in the source language. This approach allows for the cross-lingual annotation of datasets with English semantic role labels.
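For readers unfamiliar with annotation projection, the short sketch below illustrates the general idea behind the final alignment step: an English answer span collected from crowdworkers is carried over to the source-language sentence through a token-level word alignment. This is only an illustration of the technique, not the speaker's actual pipeline; the function name, example sentences, and alignment data are all made up.

```python
# Illustrative sketch only: projecting a crowdsourced English role span
# onto the source-language sentence via token alignments.
# The function and data below are hypothetical, not the speaker's code.

def project_role(answer_span, alignment):
    """Map English token indices of a crowdworker's answer to source-language
    token indices using a word alignment given as (en_idx, src_idx) pairs."""
    en_indices = set(range(answer_span[0], answer_span[1] + 1))
    return sorted({src for en, src in alignment if en in en_indices})

# English translation:  "The chef prepared the meal quickly"
# Source (French):      "Le chef a préparé le repas rapidement"
alignment = [(0, 0), (1, 1), (2, 2), (2, 3), (3, 4), (4, 5), (5, 6)]

# A crowdworker answers the generated question "Who prepared something?"
# with the English span "The chef" (tokens 0-1, role ARG0).
print(project_role((0, 1), alignment))  # -> [0, 1]: "Le chef" receives ARG0
```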
Nathan Schucher
Abstract:
Prompt tuning has recently emerged as an effective method for adapting pre-trained language models to a number of language understanding and generation tasks. In this paper, we investigate prompt tuning for semantic parsing—the task of mapping natural language utterances onto formal meaning representations. On the low-resource splits of Overnight and TOPv2, we find that a prompt tuned T5-xl significantly outperforms its fine-tuned counterpart, as well as strong GPT-3 and BART baselines. We also conduct ablation studies across different model scales and target representations, finding that, with increasing model scale, prompt tuned T5 models improve at generating target representations that are far from the pre-training distribution.
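As a rough illustration of the method the abstract describes, the sketch below shows the basic mechanics of prompt tuning: the pre-trained model is frozen, and only a small set of continuous prompt vectors, prepended to the input embeddings, is optimized. This is a toy sketch under stated assumptions rather than the paper's implementation: it uses t5-small in place of T5-xl, a made-up TOPv2-style target string, and hyperparameters chosen for readability, and it assumes PyTorch with the Hugging Face transformers library.

```python
# Minimal prompt-tuning sketch (illustrative; not the paper's implementation).
import torch
from transformers import T5ForConditionalGeneration, T5TokenizerFast

model = T5ForConditionalGeneration.from_pretrained("t5-small")  # stand-in for T5-xl
tokenizer = T5TokenizerFast.from_pretrained("t5-small")

# Freeze all pre-trained weights; only the soft prompt is trained.
for p in model.parameters():
    p.requires_grad_(False)

n_prompt = 20  # number of learnable prompt vectors
prompt = torch.nn.Parameter(torch.randn(n_prompt, model.config.d_model) * 0.02)

def forward_with_prompt(utterance, target):
    enc = tokenizer(utterance, return_tensors="pt")
    labels = tokenizer(target, return_tensors="pt").input_ids
    # Prepend the soft prompt to the embedded input tokens.
    tok_embeds = model.get_input_embeddings()(enc.input_ids)
    inputs_embeds = torch.cat([prompt.unsqueeze(0), tok_embeds], dim=1)
    attention_mask = torch.cat(
        [torch.ones(1, n_prompt, dtype=enc.attention_mask.dtype), enc.attention_mask],
        dim=1,
    )
    return model(inputs_embeds=inputs_embeds,
                 attention_mask=attention_mask,
                 labels=labels)

# Prompt tuning typically uses a relatively large learning rate for the prompt.
optimizer = torch.optim.Adam([prompt], lr=0.3)

# Hypothetical utterance and TOPv2-style meaning representation.
out = forward_with_prompt(
    "book a table for two at noon",
    "[IN:CREATE_RESERVATION [SL:NUM_GUESTS two ] [SL:TIME noon ] ]",
)
out.loss.backward()
optimizer.step()  # updates only the 20 prompt vectors; the model stays fixed
```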