Nothing new here: Emphasizing the social and cultural context of deepfakes
Wednesday, 4 December 2019
An article by Jacquelyn Burkell and Chandell Gosse (Western University, London, Ontario, Canada), published in First Monday, Volume 24, Number 12 (December 2019).
Abstract
In the last year and a half, deepfakes have garnered considerable attention as the newest form of digital manipulation. While not problematic in and of itself, deepfake technology exists in a social environment rife with cybermisogyny, toxic technocultures, and attitudes that devalue, objectify, and use women’s bodies against them.
The basic technology, which in fact embodies none of these characteristics, is deployed within this harmful environment to produce problematic outcomes, such as the creation of fake and non-consensual pornography.
The sophistication of the technology and the metaphysical nature of deepfakes as both real and not real (the body of one person, the face of another) make them impervious to many technical, legal, and regulatory solutions. For these same reasons, defining the harm deepfakes cause to those targeted is similarly difficult, and targets of deepfakes are very often not afforded the protection they require.
We argue that it is important to emphasize the social and cultural attitudes that underlie the nefarious use of deepfakes, and thus to adopt a material-based approach, as opposed to a technological one, to understanding the harm deepfakes present.