Please use this identifier to cite or link to this item:
https://www.um.edu.mt/library/oar/handle/123456789/141835

| Title: | The challenges of deepfake technology on the decision-making processes within law enforcement : a study within the EU landscape |
| Authors: | Buckingham, Elton (2025) |
| Keywords: | Deepfakes -- Malta; Disinformation -- Prevention; Artificial intelligence -- Law and legislation -- Malta; Artificial intelligence -- Law and legislation -- European Union countries; Chain of custody (Evidence) -- Malta; Law enforcement -- Decision making; Metadata -- Malta; Law enforcement -- Malta; Law enforcement -- European Union countries |
| Issue Date: | 2025 |
| Citation: | Buckingham, E. (2025). The challenges of deepfake technology on the decision-making processes within law enforcement : a study within the EU landscape (Master’s dissertation). |
| Abstract: | This study aims to scrutinize the challenges posed by deepfake technology in the decision-making processes of law enforcement, both locally and across Europe, categorizing them into three key areas:

- Knowledge and Awareness: Assessing law enforcement officers' familiarity with deepfake technologies, associated crimes, and the operational challenges encountered.
- Evidence Management: Identifying difficulties in detecting deepfake content, preserving its evidential integrity, and ensuring its admissibility.
- Legal and Procedural Challenges: Analyzing legislative frameworks and courtroom practices regarding deepfake evidence, and current initiatives to mitigate misuse.

A qualitative research approach was employed, utilizing a structured questionnaire distributed through intermediaries to law enforcement agencies across the EU, supplemented by secondary data and case studies to enhance validity through triangulation. Twelve officers responded (4 from Malta and 8 from other EU countries), all with direct or indirect experience of deepfakes in criminal investigations.

There was broad consensus among respondents on a significant lack of preparedness in addressing deepfake threats, both in terms of detection capabilities and evidentiary handling. Officers reported substantial difficulties in reliably identifying manipulated media, preserving its chain of custody, and navigating inconsistent legal standards across jurisdictions. Detection challenges are compounded by rapid advances in synthetic media quality, which often outpace available forensic tools.

In contrast to findings from EU respondents, studies from the United States and Australia show a slightly higher level of operational preparedness, often attributed to earlier adoption of digital forensic techniques and more rapid development of AI-focused legal scholarship. Even in these jurisdictions, however, significant concerns remain about evidentiary reliability and procedural fairness when presenting AI-generated content.

Deepfake evidence presents unique admissibility challenges, such as establishing authenticity, reliability, and the absence of tampering. Courts are struggling to develop consistent standards, and current digital evidence frameworks often lack explicit provisions for synthetic media. There is growing debate over whether new evidentiary rules are needed or whether existing frameworks (such as chain of custody, expert testimony, and metadata analysis) can adapt adequately; a minimal illustrative sketch of these routine steps follows this record.

On the ground, officers report that gaps in regulation and procedural clarity lead to uncertainty in evidence collection and presentation, delaying investigations or forcing reliance on expert witnesses to establish basic authenticity. Prosecutors express concern over juror perceptions of manipulated media and the risk of undermining trust in legitimate evidence. These issues often translate into higher costs, longer case preparation times, and difficulties in securing convictions. |
| Description: | M.A.(Melit.) |
| URI: | https://www.um.edu.mt/library/oar/handle/123456789/141835 |
| Appears in Collections: | Dissertations - FacEma - 2025; Dissertations - FacEMAMAn - 2025 |
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| 2519EMAEMA590705033202_1.pdf | | 2.64 MB | Adobe PDF |
Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.
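
The abstract above points to chain of custody and metadata analysis as existing digital-evidence practices that may have to absorb synthetic media. The sketch below is not drawn from the dissertation; it is a minimal Python illustration, under assumed file names and field names, of two routine acquisition steps: hashing a seized media file so later tampering is detectable, and recording basic file metadata alongside that hash.

```python
# Illustrative sketch only; not from the dissertation. Shows two routine
# acquisition steps: (1) hashing a media file for chain-of-custody purposes,
# and (2) recording basic file metadata with the hash. File names, the
# examiner field, and the record layout are hypothetical.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path


def sha256_of_file(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks and return its SHA-256 hex digest."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def custody_record(path: Path, examiner: str) -> dict:
    """Build a minimal acquisition record: hash, size, timestamps, examiner."""
    stat = path.stat()
    return {
        "file": path.name,
        "sha256": sha256_of_file(path),
        "size_bytes": stat.st_size,
        # Filesystem timestamps only; embedded metadata (EXIF, container
        # atoms) would need a dedicated library and is easily forged, so it
        # should be treated as an investigative lead, not proof of authenticity.
        "modified_utc": datetime.fromtimestamp(stat.st_mtime, tz=timezone.utc).isoformat(),
        "acquired_utc": datetime.now(tz=timezone.utc).isoformat(),
        "examiner": examiner,
    }


if __name__ == "__main__":
    # Hypothetical evidence file; replace with the actual seized media path.
    record = custody_record(Path("suspect_video.mp4"), examiner="Officer 001")
    print(json.dumps(record, indent=2))
```

A hash recorded at seizure only shows that the file has not changed since acquisition; it says nothing about whether the content was synthetic before seizure, which is why the abstract treats detection and expert testimony as separate, unresolved problems.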
