Here Are The Tools To Create Our Future Fake News

A protester carries a sign during the Women’s March in Washington on Jan. 21, 2017. (Joshua Lott/AFP/Getty Images)
 
At the heart of fake news — meaning deliberately misleading, untrue information presented as a news item — is a simple idea: People often want to believe things that aren’t true. Some people want to believe that Barack Obama wasn’t born in the United States and, therefore, that his presidency was illegitimate. Some people want to believe that President Trump doesn’t know the words to the national anthem. It’s not just partisanship, of course. People constantly want to believe that celebrities did unbelievable things or that bizarre occurrences actually happened.
 
What shifted over the past few years is that the Internet made falsifying news items valuable in a way that it wasn’t before. There’s indirect value in sharing things that attract a lot of attention on social media: likes, follows and so on. But there’s also money to be made. A group of teenagers in Macedonia built a large network of sites spreading false information shortly before the 2016 election not because they had a dog in the fight but because saying outrageous things drove a lot of traffic to their sites — which in turn drove a lot of ad revenue.
 
It’s also a moment in which there’s direct political value in spreading misinformation. Setting aside the value that Trump himself sees in misrepresenting reality, there has been a deliberate effort to spread inaccurate information about political candidates and issues — Hillary Clinton, Brexit, etc. — both to shape public opinion and to make it harder to discern real news from fake. If you see one article claiming that millions of people were caught voting illegally and another reporting (correctly) that there is no evidence at all of widespread voter fraud, the issue may seem as though it’s still being litigated.
 
Which brings us to porn.
There’s been a sudden interest in a new generation of tools that can be used to create misleading news stories, driven in part by users of the site Reddit who figured out a relatively easy way to superimpose celebrities’ faces onto videos of porn stars. Until last week, there was a community on Reddit called “deepfakes” that passed around not only modified porn but also examples of other video clips in which one person’s face was substituted for another’s.
 
Like swapping Nicolas Cage’s face in for Amy Adams singing “I Will Survive.”

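The face swap behind these clips is widely described as a pair of deep networks: one shared encoder that learns a generic "face code," plus a separate decoder per identity. Here is a minimal numpy sketch of that data flow only — the weights are random and untrained, and all dimensions are made up for illustration; the real systems are trained convolutional networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: 32x32 grayscale faces flattened to 1024-d vectors.
IMG, LATENT = 1024, 64

# One shared encoder learns a common "face code"; each identity
# gets its own decoder that reconstructs faces from that code.
W_enc = rng.normal(scale=0.01, size=(IMG, LATENT))
W_dec_a = rng.normal(scale=0.01, size=(LATENT, IMG))  # decoder for person A
W_dec_b = rng.normal(scale=0.01, size=(LATENT, IMG))  # decoder for person B

def encode(face):
    """Compress a face image into the shared latent code."""
    return np.tanh(face @ W_enc)

def decode(code, W_dec):
    """Reconstruct a face image from a latent code with one identity's decoder."""
    return np.tanh(code @ W_dec)

# After training (omitted here), the swap itself is simple: encode a
# frame of person A, then decode it with person B's decoder, so B's
# face appears with A's pose and expression.
frame_of_a = rng.normal(size=IMG)
swapped = decode(encode(frame_of_a), W_dec_b)
print(swapped.shape)  # (1024,)
```

The key point the sketch captures is that the swap requires no per-frame editing: once the two decoders are trained, every frame of a video can be pushed through the same encode-then-decode pipeline automatically.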

How this could be used: Footage of an audience (say, at a State of the Union address) could be manipulated so that a member of Congress appears, based on the movement of his or her mouth, to be saying something that was never actually said.
 
Face2Face is an early example of the ability to change a person’s appearance in a recorded video. In short, the process, introduced in 2016 by researchers Justus Thies, Michael Zollhöfer, Marc Stamminger, Christian Theobalt and Matthias Niessner, creates a sort of video marionette that can be made to present whatever facial expressions the user wants.
 
This has an obvious shortcoming: It doesn’t manipulate the audio of the original video.
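The "video marionette" process described above can be pictured as expression transfer on a parametric face model: the target's identity stays fixed while the expression parameters are driven by a source actor. This is a heavily simplified numpy sketch — the model, its sizes and its coefficients are all hypothetical stand-ins; the actual system fits a dense 3D face model to every frame of both videos.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parametric face model: a face is the mean shape plus
# weighted identity components plus weighted expression components.
N_PTS, N_ID, N_EXPR = 300, 40, 20  # made-up sizes for illustration

mean_face = rng.normal(size=N_PTS)
id_basis = rng.normal(size=(N_PTS, N_ID))
expr_basis = rng.normal(size=(N_PTS, N_EXPR))

def render(id_coeffs, expr_coeffs):
    """Reconstruct face geometry from identity + expression parameters."""
    return mean_face + id_basis @ id_coeffs + expr_basis @ expr_coeffs

# Coefficients "fitted" from two videos (random here for the sketch).
target_id = rng.normal(size=N_ID)      # the person in the target video
source_expr = rng.normal(size=N_EXPR)  # the puppeteer's expression

# The marionette step: keep the target's identity, but drive the
# expression with the source actor's coefficients.
puppet = render(target_id, source_expr)
print(puppet.shape)  # (300,)
```

Separating identity from expression is what makes the puppeteering work: swapping only the expression coefficients changes what the face appears to be doing without changing whose face it is.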
 
Creating a fake video of a person speaking using his or her own voice


How this could be used: To fabricate a controversial statement by a political candidate.
 
This is a remarkable video from Adobe Systems. Sentences can be rearranged to make it sound as though the source for a voice said something that was never said.
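The crudest way to picture this rearranging is cut-and-splice editing: segment a recording into per-word clips and concatenate them in a new order. (That is only an illustration of the idea; Adobe's demonstrated tool goes further and models the voice itself, so it can produce words the speaker never recorded.) A numpy sketch with random noise standing in for real audio:

```python
import numpy as np

SAMPLE_RATE = 16_000  # samples per second, a common speech rate

# Pretend we've already segmented a recording into per-word audio
# clips (random noise stands in for real samples here).
rng = np.random.default_rng(2)
words = {
    "I": rng.normal(size=SAMPLE_RATE // 4),
    "did": rng.normal(size=SAMPLE_RATE // 4),
    "not": rng.normal(size=SAMPLE_RATE // 4),
    "say": rng.normal(size=SAMPLE_RATE // 4),
    "that": rng.normal(size=SAMPLE_RATE // 4),
}

def splice(order):
    """Concatenate word clips in a new order to 'say' a new sentence."""
    return np.concatenate([words[w] for w in order])

original = splice(["I", "did", "not", "say", "that"])
doctored = splice(["I", "did", "say", "that"])  # the denial removed
print(len(original), len(doctored))  # 20000 16000
```

Even this naive splice illustrates why a "scratchy and at times muffled" recording is so dangerous: low fidelity hides exactly the seams and artifacts that would otherwise give a forgery away.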
 
Think about a newly discovered recording of someone having used a racial slur back in college. The recording is scratchy and at times muffled. Would you feel confident that it was legitimate if you stumbled upon it online?
 
Now imagine that this technology is combined with the technology above: building out an artificial version of what someone said and where they said it.
 
Creating fake people
We live in a world, too, where random people can suddenly become important contributors to the national conversation, thanks to social media. The ability to generate realistic human actors from scratch makes it possible to invent witnesses who don’t exist to events that didn’t happen.
 
Creating realistic photographs of nonexistent people

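The published systems that invent photorealistic faces are typically generative adversarial networks, in which a "generator" learns to turn random numbers into images that a second network cannot distinguish from real photographs. This toy numpy generator shows only the input-to-output shape of that idea — random untrained weights, tiny made-up dimensions, and no adversarial training at all:

```python
import numpy as np

rng = np.random.default_rng(3)

# A generator maps a random "latent" vector to an image-shaped array.
LATENT, H, W = 64, 16, 16  # illustrative sizes only

W1 = rng.normal(scale=0.1, size=(LATENT, 256))
W2 = rng.normal(scale=0.1, size=(256, H * W))

def generate(z):
    """Turn a random latent vector into a (tiny, fake) image."""
    hidden = np.maximum(z @ W1, 0)  # ReLU layer
    pixels = np.tanh(hidden @ W2)   # pixel values in [-1, 1]
    return pixels.reshape(H, W)

fake_face = generate(rng.normal(size=LATENT))
print(fake_face.shape)  # (16, 16)
```

Because every face starts from a fresh random vector, the trained version of such a system can produce an endless supply of distinct people who have never existed — exactly what a fabricated "witness" would require.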

How this could be used: In an extreme example, imagine a military officer appearing to conduct an interview from within the Oval Office.
 
Part of the shift in the ability to create realistic forgeries of photos and videos is expanded access to powerful computers. It’s not just that the software for building virtual environments has advanced; the speed at which a computer can draw those environments has also improved dramatically.
 
The video above, first posted to Facebook by Oscar Olarte Ruiz, shows the real-time addition of an actor to an existing virtual scene.
 
Changing the time of day or weather in a photo or video


 

Source: The Washington Post
