ADAPT is pushing the boundaries of human speech and gesture recognition to increase the accuracy of robotic interpretation.
We develop smarter search and retrieval technologies that help deliver the most relevant digital content to the user.
State-of-the-art techniques allow users to gain an accurate overview of ongoing public events and facilitate novel audience segmentation.
Working with ADAPT - Industry Partners Have Their Say
Digital media, or content stored in digital formats, is enabling unprecedented communication across distances, languages and devices, and creating business value for partners.
Create opportunities for innovation, growth and better healthcare through connected health systems and data that help health professionals and citizens receive better information on medical conditions and treatments.
Optimise time and potential by exploring solutions and developing new business models that transform businesses by unlocking the knowledge in unstructured digital data.
ADAPT is supported by the Irish government through a Science Foundation Ireland investment of €24 million. This funding is leveraged with an additional €26 million from industry partners.
Unlock the Potential of Collaborative FinTech Research
Ethical and privacy issues need to be at the forefront of technology development, and ADAPT helps industry and individuals understand, manage and control overwhelming amounts of content.
Improving global discoverability through better website keywords.
ADAPT has significant commercial expertise having engaged with more than 140 companies in Ireland and beyond, ranging from indigenous start-ups to multinational enterprises. With a dedicated business and commercial development team, ADAPT has a strong focus on understanding and addressing technology and business challenges in order to deliver solutions tailored for industry collaborators.
Responding to industry needs, ADAPT has developed an extensive range of cutting-edge, commercialisation-ready technologies.
Professor Vincent Wade is Director of the ADAPT Centre for Digital Content Technology. His research focuses on knowledge engineering for adaptive (web) systems and personalisation, and is being successfully applied in three application areas: Telecommunications & Service Management, adaptive eLearning, and Web applications. He was awarded a Fellowship of Trinity College for his contribution to research and has published over one hundred and fifty scientific papers in international journals and conferences of repute. Vincent also holds the position of Visiting Scientist at IBM’s Centre for Advanced Studies in Ireland (2006 to date).
Professor Andy Way’s research interests include all areas of machine translation, which he has applied to a career that has spanned academia and industry. As head of the Transforming Global Content theme, Professor Way focuses on achieving translation of controlled quality, handling different levels of noise across multiple language pairs and domains and optimally leveraging human quality interactions for effective and authentic communication across language and cultural barriers.
Liam is responsible for directing ADAPT’s commercialisation and business development activities and is the main contact for all industry engagement, both new and existing. Liam provides leadership for the strategic development, implementation and continuous review of the ADAPT Centre commercialisation strategy. He has over 25 years’ experience across Microsoft product development, internationalisation, and sales and marketing.
Naomi is Associate Professor in Digital Media Systems at Trinity, and her specialist area is Human Speech Communication. She has worked in high-tech start-ups in the field of Digital Signal Processing (DSP) Systems Development, including her own company, founded in 2002. She leads the Interacting with Global Content research platform, which aims to support the adaptation of conversational content to the user's cognitive state to maintain an engaging multimodal dialogue regardless of the physical context.