Four ADAPT Research Papers Accepted at the 60th Annual ACL Conference 2022

17 May 2022
ACL 2022

The 60th Annual Meeting of the Association for Computational Linguistics (ACL) has accepted four Natural Language Processing (NLP) research papers affiliated with the ADAPT Centre. ACL is the leading conference in the NLP field and will host more than 1,500 researchers from across the world presenting their ground-breaking work in Computational Linguistics. The conference will take place from the 22nd to the 27th of May 2022 in Dublin.

Interest in NLP from industry and the scientific community is at an all-time high, as the field is critical to the development of key speech and text technologies deployed in personal assistants, device interaction, web search, recommender systems, text editors, and dialogue systems, among many other applications.

The ACL Annual Meeting is the largest and most highly ranked international conference for scientists solving computational problems involving human languages, and ADAPT researchers are delighted to present the following long papers at the main ACL conference.

Title: Quantified Reproducibility Assessment of NLP Results
Researchers: Anya Belz, Maja Popovic, and Simon Mille
PDF: https://aclanthology.org/2022.acl-long.2.pdf
This paper describes and tests a method for carrying out quantified reproducibility assessment (QRA) that is based on concepts and definitions from metrology. QRA produces a single score estimating the degree of reproducibility of a given system and evaluation measure, on the basis of the scores from, and differences between, different reproductions. The authors test QRA on 18 system and evaluation measure combinations (involving diverse NLP tasks and types of evaluation), for each of which they have the original results and between one and seven reproduction results. The proposed QRA method produces degree-of-reproducibility scores that are comparable across multiple reproductions not only of the same, but of different original studies. The method also facilitates insights into causes of variation between reproductions, and allows conclusions to be drawn about what changes to system and/or evaluation design might lead to improved reproducibility.
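To give a flavour of the kind of statistic involved, the sketch below computes a coefficient-of-variation style spread over an original score and a hypothetical set of reproduction scores; lower spread suggests a higher degree of reproducibility. This is an illustrative assumption, not the authors' exact formulation, which may differ in detail (for example, through small-sample corrections).

```python
from statistics import mean, stdev

def reproducibility_spread(scores):
    """Spread (in %) across an original score and its reproductions:
    the sample standard deviation relative to the mean. Illustrative
    only; the paper's QRA formulation may differ in detail."""
    return 100 * stdev(scores) / mean(scores)

# Hypothetical example: an original BLEU score and three reproductions.
print(round(reproducibility_spread([27.4, 27.1, 26.8, 28.0]), 2))
```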

Title: TwittIrish: A Universal Dependencies Treebank of Tweets in Modern Irish
Researchers: Lauren Cassidy, Teresa Lynn and Jennifer Foster
PDF: https://aclanthology.org/2022.acl-long.473.pdf
This research facilitates the development of language models and parsers for accurate automatic syntactic parsing of user-generated content. The paper presents TwittIrish, the first Irish Universal Dependencies Twitter Treebank. Modern Irish is a minority language in decline and at risk of endangerment. Even as Machine Learning finds applications in everyday life, it is important to develop technology able to handle the orthography, lexicon and syntax of such a language, which diverge from the standard texts more commonly used in Natural Language Processing (NLP). For this reason, the paper can prove impactful in the Computational Linguistics domain as well as for the Irish language and its community as a whole.
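For readers unfamiliar with the format, Universal Dependencies treebanks such as TwittIrish are distributed as CoNLL-U files. The sketch below shows how such a file could be inspected with the third-party conllu package; the file name used here is an assumption for illustration, not necessarily the official release name.

```python
# Minimal sketch of inspecting a CoNLL-U treebank with the `conllu` package
# (pip install conllu). The file path is an assumed, illustrative name.
import conllu

with open("ga_twittirish-ud-test.conllu", encoding="utf-8") as f:
    sentences = conllu.parse(f.read())

for sentence in sentences[:3]:
    # Each token carries its surface form, universal POS tag, head index
    # and dependency relation, as in any CoNLL-U formatted treebank.
    for token in sentence:
        print(token["form"], token["upos"], token["head"], token["deprel"])
    print()
```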

Title: Achieving Reliable Human Assessment of Open-Domain Dialogue Systems
Researchers: Tianbo Ji, Yvette Graham, Gareth J. F. Jones, Chenyang Lyu and Qun Liu
PDF: https://aclanthology.org/2022.acl-long.445.pdf
This paper deals with the evaluation of open-domain dialogue systems, an area known to be highly challenging in the scientific community. Open-domain dialogue systems are designed to build long-term and deep connections with users by meeting the human need for communication and social belonging. The automatic metrics currently used to evaluate such systems are unable to recognise whether a conversation is high-quality, and hence are largely unreliable. This research presents a human evaluation approach that is reliable, yet feasible and low-cost. Research in the area of open-domain dialogue systems can prove crucial for the development of voice-related technology and other computational linguistics applications worldwide.

Title: Human Evaluation and Correlation with Automatic Metrics in Consultation Note Generation
Researchers: Francesco Moramarco, Alex Papadopoulos Korfiatis, Mark Perera, Damir Juric, Jack Flann, Ehud Reiter, Anya Belz, Aleksandar Savkov
PDF: https://aclanthology.org/2022.acl-long.394.pdf
In recent years, machine learning models have rapidly become better at generating clinical consultation notes; yet, there is little work on how to properly evaluate the generated consultation notes to understand the impact they may have on both the clinician using them and the patient's clinical safety. To address this, the authors present an extensive human evaluation study of consultation notes where 5 clinicians (i) listen to 57 mock consultations, (ii) write their own notes, (iii) post-edit a number of automatically generated notes, and (iv) extract all the errors, both quantitative and qualitative. They then carry out a correlation study between 18 automatic quality metrics and the human judgements. They find that a simple, character-based Levenshtein distance metric performs on par with, if not better than, common model-based metrics like BERTScore. All findings and annotations are open-sourced.
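To make the metric comparison concrete, here is a minimal, self-contained sketch of computing a character-level Levenshtein similarity between generated and reference notes and correlating it with human ratings. The note pairs and scores below are invented purely for illustration; they are not the paper's data, and the exact normalisation used in the study may differ.

```python
# Character-based Levenshtein similarity correlated with (invented) human scores.
from scipy.stats import spearmanr

def levenshtein(a: str, b: str) -> int:
    """Character-level edit distance via the standard dynamic-programming recurrence."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def similarity(generated: str, reference: str) -> float:
    """Normalise edit distance to a 0-1 similarity score."""
    return 1 - levenshtein(generated, reference) / max(len(generated), len(reference), 1)

# Hypothetical generated/reference note pairs and human quality ratings.
pairs = [("Patient reports mild headache.", "Patient reports a mild headache."),
         ("No cough or fever noted.",       "Patient denies cough and fever."),
         ("Prescribed rest and fluids.",    "Advised rest and plenty of fluids.")]
human_scores = [4.5, 3.0, 3.5]

metric_scores = [similarity(g, r) for g, r in pairs]
print(spearmanr(metric_scores, human_scores))  # rank correlation with human judgements
```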

In addition to the research papers accepted to the ACL Conference, two of ADAPT's Principal Investigators, Professors John D. Kelleher (TU Dublin) and Andy Way (DCU), are also the Local Organisation Chairs. Both are widely recognised for their work in the areas of Natural Language Processing, Deep Learning and Machine Translation.

Furthermore, ADAPT’s Education and Public Engagement (EPE) team will have a presence at ACL in the form of an All Ireland Linguistics Olympiad (AILO) stand, presented by Dr Cara Greene, highlighting AILO and International Linguistics Olympiad (IOL) alumni now working in NLP-related areas.

Whether on the organisational or the research front, the ADAPT Centre is delighted to support the 60th ACL Conference this year.

More about the conference on the ACL Website.