Grade 6: £31,502–£37,386
The Oxford Internet Institute is a world-leading department at the University of Oxford, conducting cutting-edge multidisciplinary research on the societal challenges and opportunities created by rapid digital technological change. Our goal is rigorous intellectual innovation that can address public problems.
We are seeking a research assistant to join the research group led by Dr Luc Rocher, Programme Director of the DPhil in Social Data Science. The research assistant will join a year-long project on the societal impact of privacy technologies, working in close collaboration with Dr Rocher. The main focus of the role will be developing and implementing new computational methods to audit modern privacy-enhancing technologies.
This position is particularly suited to pre-doctoral candidates seeking to build research skills and collaborate on research projects before applying for doctoral studies, although the skills and experience gained will be valuable in a range of positions in and outside academia.
The role is full-time (37.5 hours per week; part-time hours will be considered), available from 1 September 2023 for a fixed term of 8.5 months. This position is based in the UK and offers the opportunity for hybrid working; the group are open to discussing a working pattern that works for the successful candidate.
The role involves research on the impact of modern privacy technologies such as differential privacy and synthetic data generation. The post holder will be expected to:
- Manage their own research and administrative activities;
- Contribute to wider project planning;
- Select, follow, implement, and adapt specialist methodologies from machine learning, privacy, and security; and
- Contribute to the design of research materials and scientific articles.
The successful candidate will hold, or be close to completing, an MSc degree in a relevant field such as computer science, engineering, statistics, or mathematics, and will have excellent communication skills.
Full details of the requirements for this post can be found in the Job Description link below.
If you would like to discuss this role, please contact Dr Luc Rocher firstname.lastname@example.org.
You will be required to upload a supporting statement, a CV, and details of two referees as part of your online application. The closing date for applications is 12:00 midday (UTC) on 19 June 2023.
Committed to equality and valuing diversity.
How to apply
- Opening date: Wednesday 24 May, 2023, 9:00am UTC
- Closing date: Monday 19 June, 2023, 12:00pm UTC
- Interview date: Interviews are scheduled for the week beginning Monday 3 July, 2023
Only applications received before 12:00pm UTC on Monday 19 June 2023 will be considered.
Applications are invited from highly qualified individuals for the position of Research Assistant to collaborate with Professor Brent Mittelstadt (Associate Professor), Professor Sandra Wachter (Professor of Technology and Regulation), and Dr Chris Russell (external collaborator) on interpretability, transparency, and fairness in machine learning and AI.
This is a full-time position, available immediately for a fixed term until 31 December 2024, subject to funding availability. This position is based in the UK and offers the opportunity for hybrid working; the team are open to discussing a working pattern that works for the successful candidate.
The post will be part of the ‘Trustworthiness auditing in AI’ project, which will research and develop tools and tests to evaluate the efficacy of AI accountability tools. The project involves intensive multidisciplinary collaboration across machine learning, philosophy, clinical and social psychology, and law.
The post-holder will collaborate with Professor Brent Mittelstadt, Professor Sandra Wachter, Dr Chris Russell, Professor Netta Weinstein, and other project researchers from law, philosophy, and psychology to undertake case studies in medicine and open science, and develop a meta-toolkit for trustworthy AI. With Dr Chris Russell, the post-holder will undertake research and develop methods to qualitatively and quantitatively evaluate different types of explanations of model outputs in meeting the needs of different audiences.
The position is suited to candidates who hold a postgraduate degree in a relevant discipline such as computer science, data science, statistics, mathematics, or a related field, and who have an interest in ethical and legal aspects of machine learning and AI (e.g. interpretability, fairness, bias, discrimination).
The successful candidate will be able to provide research support for a work package focused on technical aspects of interpretability and fairness in machine learning, and will have experience of publishing results in peer-reviewed international academic journals and conferences. Excellent technical, analytical, and writing skills are essential. Expertise in machine learning, computer vision, human-computer interaction, or similar relevant areas is highly desirable.
Full details of the duties and the requirements for this post can be found in the Job Description link below.
Further queries can be directed to Professor Brent Mittelstadt email@example.com.
You will be required to upload a supporting statement, a CV, and details of two referees as part of your online application. The closing date for applications is 12:00 midday (UTC) on 2 June 2023.
Committed to equality and valuing diversity.
How to apply
- Opening date: Friday 26 May, 2023, 15:48 UTC
- Closing date: Friday 2 June, 2023, 12:00pm UTC
Only applications received before 12:00pm UTC on Friday 2 June 2023 will be considered.