| Field | Value |
|---|---|
| Funder | UK Research and Innovation Future Leaders Fellowship |
| Recipient Organization | Lancaster University |
| Country | United Kingdom |
| Start Date | Nov 14, 2024 |
| End Date | Nov 13, 2028 |
| Duration | 1,460 days |
| Number of Grantees | 1 |
| Roles | Fellow |
| Data Source | UKRI Gateway to Research |
| Grant ID | MR/Y018397/1 |
Novel deep fake technology poses a serious and imminent threat to society and work is urgently needed to better protect ordinary people. Deep fakes (also termed synthetic media) refer to audio, image, text, or video that has been automatically synthesised by a machine learning system. Although such technological advances can have impressive and entertaining applications, they are already being weaponised for the purposes of image-based sexual abuse, financial fraud, and amplifying disinformation campaigns.
Over 90% of deep fake videos are non-consensual pornography and, of those, 99% feature women, yet most research focuses on technical approaches for protecting celebrities and world leaders. In my FLF I will examine how psychological science can help find ways to detect deep fakes and protect ordinary people from the harms that deep fake technologies present.
I will work with partners including the police, the public, government, and technology experts to co-design and develop an innovative forensically assured verification system to detect deep fake pornography. This system will be built using state-of-the-art facial recognition technology and ongoing partner consultation will ensure development of a practically useful, usable, trusted, and sustainable system that protects ordinary people.
In this ever-evolving space, my work will also examine emerging threats from the newest wave of deep fakes and assess the effectiveness of currently available protective tools. In this FLF I will adopt an interdisciplinary approach to undertake six interrelated research and dissemination work packages (WPs):
WP1 draws on state-of-the-art facial recognition software to develop a forensically assured verification system (FAVS) to address the challenge of deep fake pornography. The system will be developed and refined as a proof of concept using non-sexual material before being applied to the detection of deep fake pornography, with further refinements informed by findings from the other WPs. Accuracy across different sociodemographic groups will be examined to check and improve algorithmic fairness.
WP2 combines methodologies and theory from psychology and computer science to examine 1) the realism of the latest deep fake media, 2) modality-based individual differences in detection ability by comparing typical, early-blind, and early-deaf individuals' detection of image, audio, and video deep fakes, and 3) how the neuropsychological theory underpinning the results can be used to inform and create sociotechnical tools to tackle the threats from deep fakes.
WP3 uses interviews and focus groups with the police, public, and tech experts to gain in-depth understanding of the current and emerging threats from deep fake technology, victim reporting and police response, and to allow co-design and development of FAVS. The findings will feed into the development of FAVS (WP1) to ensure the system is useful, usable, trusted, and sustainable.
WP4 is a collaboration with Google to examine the effectiveness of their latest tools aiming to protect ordinary people from visual misinformation. Drawing on psychological theory and experimental methods I will analyse their approach and recommend ways to improve their tools.
WP5 explores optimal aftercare provision for victim-survivors of online image-based sexual abuse through interviews with victim-survivors who have publicly spoken about being targets of online sexual abuse. I will also talk to senior representatives from relevant Violence Against Women and Girls (VAWG) charities. WP5 will also include a systematic review of the literature on 'what works' in the provision of aftercare and justice for victim-survivors of image-based sexual abuse. The findings will feed into the development of FAVS (WP1) to ensure responsible innovation.
WP6 focuses on career development, knowledge sharing, and impact, ensuring clear scientific advances, strong practical impacts, and the legacy of the research.