Samsung Makes Deepfaking Easier


Researchers at the Samsung Group have developed a technology that will make it even easier to produce deepfake porn. The company is aware of the dangers the technology poses but points to the inevitability of technological progress, following the principle: »If we don’t do it, someone else will.«

Researchers at Samsung’s AI research laboratory, part of the South Korean Samsung conglomerate, have developed an algorithm that can animate a face with seemingly realistic movements from a single photo; they published their work on the preprint server arXiv. The more photos are available, the more realistic the videos turn out, of course, but a single photo – for example, a profile picture from Facebook or from a company’s intranet page – is enough to create a lifelike video, making it possible to produce credible deepfake pornography of a person.

Deepfake porn consists of videos in which the image of a person is inserted into an existing porn scene, giving the impression that the person was actually involved in the film. Until recently, it took dozens of different photos of a person to create a realistic-looking scene of just a few seconds. So far, fake porn has mainly affected celebrities, of whom there is naturally a great deal of photo and film footage showing their faces in motion. Even then, however, warnings were raised that private individuals – especially targets of vengeful ex-partners – could be put in difficult situations by deepfake porn.

As a result, several streaming providers banned deepfake porn from their platforms, so the community had to move to smaller sites and Reddit-like portals.

Now, apparently, a single image is enough to animate a face on film. As an example, the researchers published an animated version of the Mona Lisa. Alongside the presentation, they released a statement suggesting that Samsung is well aware of the dangers the simplified technology can pose in users’ hands: »We realize that our technology can have a negative use for the so-called ‘deepfake’ videos. However, it is important to realize that Hollywood has been making fake videos (aka ‘special effects’) for a century, and deep networks with similar capabilities have been available for the past several years. Our work (and quite a few parallel works) will lead to the democratization of the certain special effects technologies.«

In addition to the danger of people involuntarily becoming porn stars, the technology is also likely to have problematic consequences in the field of political propaganda. Fake news, conspiracy sites, and targeted voter manipulation have preoccupied Western democracies ever since the election of Donald J. Trump, in which Russia apparently meddled massively. With the new technology, it will become even easier to create fake videos of unwanted politicians and spread them through social media shortly before elections.

Just recently, Trump shared a video of his Democratic opponent Nancy Pelosi that had been manipulated to give the impression that Pelosi was drunk. It quickly became clear that the video had been manipulated. But with ever more capable algorithms easily available – ones that work ever more convincingly and are harder to expose – fake news could spread even more powerfully than it has so far.

Here is the researchers’ presentation:

[embedded video]
