Location: Trinity College Dublin
Level: Primary Degree

We are no longer accepting applications for this position

Post Status: Specific Purpose Contract – Full-time (14 months if started 1st Dec 2020)

Research Group / Department / School: Sigmedia Research Group, ADAPT Centre, School of Engineering, Trinity College Dublin

Reports to: Principal Investigator, Prof. Naomi Harte

Salary: Between €23,061-€34,930 per annum (depending on experience)

Hours of Work: 37 hours per week (full time)

Whilst the role can be started remotely in December 2020 if practical, the ideal candidate will be in Dublin, Ireland by January 2021 to allow working in our lab in TCD. Note that the PI is open to discussing work practices that include a mixture of lab and home working.

Closing Date: 12 Noon (GMT), 16th November (or until filled)

Post Summary

The Science Foundation Ireland ADAPT Research Centre (adaptcentre.ie) seeks to appoint a Research Assistant in Speech Interaction.

The successful candidate will support research in online interaction in teaching scenarios, in the context of the recently funded SFI COVID-19 Rapid Response Project, RoomReader, led by Prof. Naomi Harte in TCD and Prof. Ben Cowan in UCD. The candidate will be working with a team to drive research into multimodal cues of engagement in online teaching scenarios. The work involves a collaboration with Microsoft Research Cambridge and Microsoft Ireland.

The candidate should have experience in speech-based interaction. The candidate will also be responsible for supporting research in a number of areas, including:

  • Design of an online scenario to elicit online teaching interactions
  • Dataset capture, including candidate recruitment, hardware set-up and co-ordination
  • Dataset labelling, curation and sharing for future research dissemination
  • Supporting an ADAPT Citizens think-in around AI for Education
  • Supporting user trials and interviews to support Teacher- and Student-Centred Interface Design

Thus, the ideal candidate will have specific expertise in speech interaction and in dataset design and capture. Reporting to a Principal Investigator, the successful candidate will work within a larger group of Postdoctoral Researchers, PhD students and Software Developers. They will have exposure to all aspects of the project lifecycle, from requirements analysis to design, code, test and face-to-face demonstrations, including with our industry partners Microsoft Research and Microsoft Ireland.

The successful candidate will work alongside the best and brightest talent in speech and language technologies, and video processing, in the Sigmedia Research Group on a day-to-day basis. The wider ADAPT Research Centre will give exposure to a broad range of technologies, including data analytics, adaptivity, personalisation, interoperability, translation, localisation and information retrieval. As a university-based research centre, ADAPT also strongly supports continuous professional development and education. In this role you will develop as a researcher, both technically and scientifically. In addition, ADAPT will support candidates to enhance their confidence, leadership skills and communication abilities.

Standard Duties and Responsibilities of the Post

  • Identify and analyse research papers in data set design and collection to elicit a range of interactions typical of teaching scenarios
  • Analyse state of the art in understanding of online human interaction scenarios, specifically those relevant to online teaching
  • Merge findings from the literature with feedback from user interviews
  • Liaise with engineering and HCI experts to refine and influence approaches to the project at all levels
  • Compile applications for ethical approval and adhere to ethical standards in all data collection efforts
  • Report regularly to the PI of the project, and interact regularly with other team members to maintain momentum in the project
  • Recruit subjects for dataset capture
  • Record datasets and carry out subsequent editing and labelling for project deployment
  • Contribute to project publications

Funding Information

The position is funded through the SFI COVID-19 Research Call 2020.

Person Specification

The successful candidate will have broad experience in speech-based interaction. We are looking for someone who can span the technical and human aspects of this project, to design a standard online interaction task (akin to MapTask or Diapix), and to build a new dataset that will allow the analysis of multimodal cues of engagement in online teaching scenarios, both for this project and the wider research community. The successful candidate is expected to:

  • Have a thorough understanding of speech-based interaction, including linguistic, verbal, non-verbal and visual cues
  • Be skilled at taking disparate research ideas and drawing innovative conclusions or seeing new solutions
  • Have excellent interpersonal skills
  • Be highly organised in their work, with an ability to work remotely if necessary


  • Candidates appointed to this role must have a primary degree or (ideally) a postgraduate qualification (e.g. Masters) in Linguistics, Psychology, Engineering or Computer Science, or a related field

Knowledge & Experience (Essential & Desirable)

Essential:

  • Understanding of multimodal cues in speech-based interaction
  • Familiarity with the MS Teams environment
  • Use of standard tools for editing and labelling a dataset, e.g. Adobe Suite, ffmpeg, ELAN, Anvil, Praat, etc.

Desirable:

  • Dataset collection and release experience
  • Basic knowledge of Python or another scripting language, or willingness to learn as necessary

Skills & Competencies

  • Excellent written and oral proficiency in English (essential)
  • Good communication and interpersonal skills, both written and verbal
  • Proven aptitude for automating workflows through coding
  • Proven ability to prioritise workload and work to exacting deadlines
  • Flexible and adaptable in responding to stakeholder needs
  • Enthusiastic and structured approach to research and development
  • Excellent problem-solving abilities
  • Desire to learn about new products and technologies, and to keep abreast of new technical and research developments

Benefits

  • Competitive salary and equity
  • Computer and peripherals of your choice
  • A fast-paced environment with impactful work
  • Pension
  • Day Nursery
  • Travel Pass Scheme
  • Bike to Work Scheme
  • Employee Assistance Programme
  • Sports Facilities
  • 22 days of Annual Leave
  • Paid Sick Leave
  • Training & Development
  • Staff Discounts

Sigmedia Research Group

The Signal Processing and Media Applications (aka Sigmedia) Group was founded in 1998 in Trinity College Dublin. Originally focused on video and image processing, the group today spans research across all aspects of media – video, images, speech and audio. Prof. Naomi Harte leads the Sigmedia research endeavours in human speech communication. The group has active research in audio-visual speech recognition, evaluation of speech synthesis, multimodal cues in human conversation, and birdsong analysis. The group is interested in all aspects of human interaction, centred on speech. Much of our work is underpinned by signal processing and machine learning, but we also have researchers with expertise in the linguistic and psychological aspects of speech processing to keep us grounded.


Background on ADAPT

The ADAPT Centre, a world-leading SFI Centre, is Ireland’s global centre of excellence for digital content technology, funded through Science Foundation Ireland’s Centres programme. ADAPT combines the expertise of over 300 researchers across eight Higher-Education Institutes (Trinity College Dublin, Dublin City University, University College Dublin, Technological University Dublin, Cork Institute of Technology, Athlone Institute of Technology, Maynooth University and National University of Ireland, Galway) with that of its industry partners to produce ground-breaking digital content innovations. The ADAPT Centre executive function is co-hosted between Trinity College Dublin and Dublin City University. ADAPT’s researchers have collectively won more than €100m in funding and have a strong track record of transferring world-leading research and innovations to more than 140 companies. ADAPT partners are successfully advancing the frontiers of Artificial Intelligence (AI), content analysis, machine translation, personalisation, e-learning/education, media technologies, virtual and augmented reality, and spoken interaction, as well as driving global standards in content technologies.