
The falsehood in motion | Column TL;DR

The camera always lies: it distorts, frames, crops, focuses, and highlights the filmmaker's gaze. In a way, cinema has sought to perfect the art of deception: Eisenstein stretched the Odessa Steps with editing tricks to heighten tension and drama, while Chaplin made us believe he was caught in the gears of a machine. We have also seen dinosaurs, robots from the future, and aliens.

Off-screen, photographs and videos are used as evidence of an objective reality: if a politician is recorded hurling insults, if a thief is caught in the act by a camera, or if we see a rocket launch into space, we accept these facts as true. We believe the audiovisual record is impartial and truthful.

Today, however, the boundary between cinema and reality is blurring. What once cost thousands of dollars in special effects is now achieved with Instagram and Facebook filters. Our anchors to reality are being cut, little by little. What will this do to the trust we place in videos?

Just as Photoshop changed how easily photographs can be altered, videos are now manipulated by computers, which draw on thousands of frames collected from the Internet to build truths out of lies. Deepfakes are videos created with specialized algorithms that swap one moving face for another almost flawlessly.
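How does the trick work under the hood? The early open-source face-swap tools were built around a disarmingly simple idea: a single encoder learns a shared representation of any face (pose, expression, lighting), while a separate decoder is trained to reconstruct each identity. The swap is just recombining the pieces. The sketch below, in PyTorch, is purely illustrative: the layer sizes, the 64x64 crops, and the random stand-in data are assumptions for the example, not the code of any actual tool.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps a 64x64 RGB face crop to a shared latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Rebuilds a 64x64 face from the latent; one decoder per identity."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

# Training (sketched): each decoder learns to reconstruct only its own
# person, which forces the shared encoder to capture pose and expression
# in an identity-agnostic way. Random tensors stand in for real crops.
faces_a = torch.rand(8, 3, 64, 64)
recon_a = decoder_a(encoder(faces_a))
loss = nn.MSELoss()(recon_a, faces_a)  # repeat symmetrically for person B

# The swap: encode person A's frame, decode with person B's decoder,
# so B's face appears wearing A's pose and expression.
fake_b = decoder_b(encoder(faces_a))
```

In a real pipeline, millions of such swapped crops are blended back into the source footage frame by frame, which is why the results move so fluidly.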

Although the technology is in its infancy, some incredible examples already exist. On YouTube we can see the face of Ron Swanson (Nick Offerman) superimposed on Wednesday Addams (Christina Ricci) in a video by the user DrFakenstein, or Tom Cruise's face replacing that of comedian Bill Hader as he impersonates Cruise in an interview with David Letterman, on the Ctrl Shift Face channel.


One of the simplest yet most mind-blowing videos is that of Jim Meskimen reciting the poem “Pity the Poor Impressionist”. In an unparalleled exercise, the impressionist's face is swapped for those of the different actors he imitates, while he copies their mannerisms and speech. The ease and fluidity with which he blends into each one is captivating. Some future uses of this technology will undoubtedly be wonderful: in historical re-enactments or biopics, for example, it will no longer be necessary to cast look-alikes, since the original face can be superimposed on an actor's.

However, deepfakes are also an unparalleled threat. In 2018, Jordan Peele (The Twilight Zone) played Barack Obama. Like a puppeteer, the actor controlled the image and the words of the former president at will. Similarly, in Channel 4's 2020 alternative Christmas message, we see the Queen of England dance for TikTok, speak ill of her country's officials, and criticize her own family.

The technology that allows actress Debra Stephenson to impersonate the Queen, and Peele the former president, is becoming more common and, therefore, more dangerous. We already live in an age where truth bends to the force of ideologies and the sheer abundance of information.

With this new tool, any partisan will be able to create videos tailor-made to promote their position or attack an opponent, and fake news will be backed by deepfakes. Politicians will be able to dismiss any video of themselves with the excuse that it was manipulated. Thieves may argue that it is not them, just their face overlaid on someone else's body. Videos of attacks on women and minorities, of human rights violations, and of social denunciation will lose their force.

Even on a personal level: if today we already see ex-partners taking revenge with intimate photographs, in the very near future it will be with fabricated videos.

Today we laugh at the dancing Queen and the actors' impersonations, but we keep cutting our ties with reality. We normalize this technology with humor, yet great danger is just around the corner. The camera always lies, and the scale of its deception keeps growing.

Adán Lerma is a screenwriter, holds a doctorate in the philosophy of science, and is a professor of literature.
