Location: Dublin City University
Research Career Framework
As part of this role, researchers will be required to participate in the DCU Research Career Framework (http://dcu.ie/hr/ResearchersFramework/index.shtml). This framework is designed to provide significant professional development opportunities to researchers and to offer the best opportunities in terms of a wider career path.
Background & Role
The ADAPT Centre data governance researchers develop new methods to maintain and manage AI analytics, data quality, interoperability and privacy for enterprise, open and government data. The ADAPT Centre is seeking a postdoctoral researcher for a one-year fixed-term contract to develop and validate new analytics, metrics and tools to support automated data governance and AI governance.
The successful candidate will be deployed in the risk, quality and value research team that aims to integrate semantic governance models with predictive analytics for data asset analysis, organisational monitoring and dependable AI. This research is carried out in collaboration with a world-leading data governance platform provider.
This project involves working in a multi-disciplinary team spanning computer science, psychology and business informatics. Our quantitative approach to AI and data governance draws on a metrics-based analysis of datasets, people and systems. Many aspects of data quality, value and risk have yet to be quantified as insightful, predictive or descriptive analytics, and this is a key challenge in automating data governance for the ever-larger, federated, dynamic data assets needed for digital transformation.
The key tasks are to:
- Develop new automated or semi-automated, metrics-based methods and tools to quantify data value and data quality in a data governance system.
- Work with industrial partners to develop use cases, requirements and demonstrator metrics-driven data governance systems.
- Publish the results in leading international journals and conference venues to establish thought-leadership in this rapidly expanding area.
- Collaborate with knowledge engineers, organisational psychologists and business experts to develop new governance frameworks based on a combination of metrics, knowledge graphs and organisational models.
Principal Duties and Responsibilities
The successful candidate will work with a multinational, multidisciplinary team of researchers. The work of this Postdoctoral Researcher will focus on developing new metrics that give insight into data value and quality, published in a linked-data format based on the W3C Data Cube specification. This will include working with large datasets to define appropriate descriptive, diagnostic, predictive and prescriptive analytics for data value and quality.
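To illustrate the kind of metrics-based analysis described above, the following is a minimal sketch of two simple data quality metrics (completeness and timeliness) computed over a set of records. The field names and sample records are hypothetical, and real governance metrics would be richer and published as Data Cube observations rather than printed.

```python
from datetime import date

def completeness(records, required_fields):
    """Fraction of required field values that are present (non-empty)."""
    total = len(records) * len(required_fields)
    if total == 0:
        return 1.0
    filled = sum(
        1 for r in records for f in required_fields
        if r.get(f) not in (None, "")
    )
    return filled / total

def timeliness(records, as_of, max_age_days):
    """Fraction of records updated within max_age_days of the as_of date."""
    if not records:
        return 1.0
    fresh = sum(
        1 for r in records
        if (as_of - r["updated"]).days <= max_age_days
    )
    return fresh / len(records)

# Hypothetical sample records for illustration only.
records = [
    {"name": "Acme", "vat": "IE123", "updated": date(2024, 5, 1)},
    {"name": "Beta", "vat": None,    "updated": date(2023, 1, 10)},
]
print(completeness(records, ["name", "vat"]))       # 0.75
print(timeliness(records, date(2024, 6, 1), 365))   # 0.5
```

Such scores are descriptive analytics; the role also envisages diagnostic, predictive and prescriptive analytics built on top of measurements like these.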
Reporting to the SFI Funded Investigator, the Postdoctoral Researcher will be responsible for:
- Research and implement technical solutions to meet the project goals.
- Produce top-quality journal and conference publications, in collaboration with other project members.
- Identify and write proposals for additional research funding.
- Participate in project activities, such as meetings, reviews, demonstrations and other events.
- Provide support and advice to collaborators and partners working on the same project.
- Contribute to teaching and supervision in the School of Computing.
- Engage in the dissemination of the results of the research.
- Engage in appropriate training and development opportunities as required by the Project, the School or Research Centre, or the University.
- Liaise with both internal and external stakeholders including industry and academic partners/collaborators.
- Carry out administrative work associated with the programme of research as necessary.
Applicants should have a PhD in Computer Science, Mathematics, Statistics, Data Science or a related discipline.
In addition, it is desirable that a candidate has:
- Demonstrated experience of contributing to research projects in data analytics or data science.
- Experience in developing scalable and robust software solutions for working with large-scale datasets.
- Strong technical skills, including at least two of:
- Analytics and machine learning platforms e.g. H2O or TensorFlow
- Application of NLP techniques including Named Entity Recognition and Entity Linking to extract and represent information.
- Databases: MySQL, MariaDB, MongoDB, the Apache Jena triple store, or other database technologies.
- A strong record of international peer-reviewed publication and presentation in top-tier conferences and journals.
- Excellent research skills, with experience in one or more of:
- Data Analytics
- Scaling and Evaluating Data Management Solutions
- Data Governance
Informal Queries to: Paul Keegan, Research Accountant - email@example.com Please include the ADAPT Position Title in all email communications.
Further details and application procedure can be found here