Deepfakes are algorithmically modified video and audio recordings that project one person’s appearance onto that of another, creating an apparent recording of an event that never took place. Many scholars and journalists have begun attending to the political risks of deepfake deception. Here we investigate other ways in which deepfakes have the potential to cause harms deeper than have so far been appreciated. First, we consider a form of objectification that occurs in deepfaked ‘frankenporn’, which digitally fuses the parts of different women to create pliable characters incapable of giving consent to their depiction. Next, we develop the idea of ‘illocutionary wronging’, in which an individual is forced to engage in speech acts they would prefer ...