Russian Disinformation Redux?

A new “deep fake” technology tried to topple Biden’s bid for the Presidency. But who was responsible?

The blessing of blessings is to beat the other man’s army without getting into the fight yourself.  – The Art of War

It looked so promising at first. After months of sexual assault smears that refused to gain traction against Democratic Presidential candidate Joe Biden, the right-wing blogosphere finally found a scandal that might stick. For the true believers from QAnon to Newsmax, the sixty-four-page document contained incontrovertible “proof” that Hunter Biden was a corrupt child of privilege, trading on his father’s name and doing dirty work for Communist China – which, they argued, also made Biden’s son a security risk.

The author of the document was Martin Aspen, a Swiss intelligence professional at a firm called Typhoon Investigations. Aspen’s Twitter photo showed a graying beard and ruddy European features – the face of an approachable shop teacher or a retired park ranger. The document that bore his name made the rounds on right-wing websites for a few months, but when it finally made its way to the mainstream media, it wasn’t for the expected reasons.

Elise Thomas is an investigator with the Australian Strategic Policy Institute, an organization that specializes in cyber-warfare and technology. Curious about Martin Aspen, she made inquiries with the employers he had listed on his LinkedIn page. Those employers had never heard of him. In fact, she found, nobody by that name lived in Switzerland. When she pulled up Aspen’s Twitter page and profile photo, Thomas immediately noticed something off about his eyes – an odd doubling of the pupils. She dug deeper. Further investigation of his social media presence revealed that his Facebook account had been created only months before the article was published on The Intelligence Quarterly. A reverse-image search on Google revealed that the pictures of his home on his Twitter account were lifted from TripAdvisor.

It quickly became clear who the purported author of this document was – or rather, who it wasn’t. Everything about Aspen – his resume, his photo, his entire identity – was fiction. Martin Aspen, it turned out, was human disinformation. Other anomalies confirmed what Thomas suspected: Martin Aspen’s face was not that of a real person. Current software – and even some free websites – can generate realistic faces from nothing, but the technology falls just short of perfect; artifacts like Aspen’s doubled pupils give it away. Someone had created Aspen to give the article about Hunter Biden a patina of credibility, to make it seem as if it were written by an experienced analyst at a credible security firm. But who?
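The tell Thomas spotted – mismatched pupils – reflects a general weakness of synthetic faces: they often break left/right symmetry in small ways. The sketch below is a toy illustration of the kind of check an investigator could automate, not a real detection tool; the image format (a 2D list of grayscale values) and the hand-picked eye-region boxes are illustrative assumptions.

```python
# Toy heuristic: compare the left eye patch against a horizontally
# mirrored right eye patch and return the mean absolute pixel
# difference. Higher scores suggest the kind of asymmetry (mismatched
# pupils, earrings, glasses) common in machine-generated faces.
# Real detectors use trained landmark and forensic models; this only
# demonstrates the idea.

def crop(gray, top, left, h, w):
    """Extract an h x w patch from a 2D grayscale image (list of rows)."""
    return [row[left:left + w] for row in gray[top:top + h]]

def eye_asymmetry(gray, left_box, right_box):
    """Score asymmetry between two (top, left, height, width) eye boxes."""
    lt, ll, h, w = left_box
    rt, rl, _, _ = right_box
    left_patch = crop(gray, lt, ll, h, w)
    right_patch = crop(gray, rt, rl, h, w)
    mirrored = [list(reversed(row)) for row in right_patch]
    diffs = [abs(a - b)
             for lrow, rrow in zip(left_patch, mirrored)
             for a, b in zip(lrow, rrow)]
    return sum(diffs) / len(diffs)
```

A perfectly mirrored pair of eyes scores 0; a pupil shifted by even one pixel in one eye produces a nonzero score.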

Investigators turned to the source – the person who sent the document to the website where it was first published. In September, almost one month after Martin Aspen’s account was created on Facebook, someone named Christopher Balding sent the document to Albert Marko, the registered owner of the anonymous blog The Intelligence Quarterly.

Balding admitted to creating Martin Aspen. He told NBC that he had been “handed” the document over the summer but would not say by whom. “I want to strongly emphasize I did not write the report but I know who did,” Balding mysteriously wrote in an email to NBC. Later, he had a change of heart and confessed to writing “some of” the dossier. Balding explained that when he sent the document to Marko, he created Aspen’s persona due to concerns about “legitimacy”: “Due to the understandable worry about foreign disinformation, it was paramount that the report document activity from acknowledged and public sources.” The obvious answer to legitimacy concerns is…to create an artificial person. Once he’d done that, Balding went on to plug the document on far-right sites such as WorldNetDaily. From there it cross-pollinated into social media, and once Newt Gingrich tweeted it to his 2.6 million followers, the disinformation job was complete.

Well, maybe not quite. Two months later, Martin Aspen’s cover would be blown. The story of Hunter Biden’s work in China would also limp into the public eye, but not from Balding’s “legitimate” dossier. On October 14, an article appeared in the New York Post with the headline “Smoking-gun email reveals how Hunter Biden introduced Ukrainian businessman”. Confusing details abounded: something about a laptop belonging to the former Vice President’s son, abandoned at a computer store, whose contents worried the owner enough to contact authorities. Though the claims of Biden’s corruption were similar, the chain of evidence that led to the discovery of the laptop would prove even less credible than the sixty-four-page document created by a fictional intelligence analyst. The owner of the store was so “alarmed” by the contents…that he naturally called Rudolph Giuliani, whom intelligence agencies had cited as a launderer of Russian disinformation. Politically the whole thing fizzled, but not before stories about it appeared in the Wall Street Journal and the New York Times. If this was indeed a Russian disinformation campaign, then it was a successful one. But was it actually disinformation? And what qualifies as disinformation anyway?

Martin Aspen

What is disinformation?

According to French journalist Jean-François Revel, “disinformation is the art of getting your enemies to say what you want them to say.” At its most rudimentary, it is a way for one group to control the conversation of another group. It’s also a form of psychological “soft warfare” that can operate on its own or be used to enhance traditional warfare. It creates division and confusion by anonymously manipulating the information a target consumes, such as social media or news; the victims are flooded with confusing data. When used by one country on another, information warfare “mostly uses words and images to persuade, inform, mislead, and deceive so that the adversary does not use the (fully operational) military assets it does have, and the military outcome is the same as if those military assets had been destroyed” (Lin and Kerr). Whether it’s used by a foreign adversary, a political party, or a group of activists, the ultimate goal of disinformation is to change the minds of the targets without appearing to do so. Disinformation, therefore, is extremely difficult to trace.

An example of domestic disinformation would be 2016’s “Pizzagate”, when a group of pro-Trump conspiracy theorists began to flood social media with posts about the DNC being a front for a pedophile ring. Trump’s campaign benefited from the false claims (and, in the case of General Flynn’s son, boosted them).

The era of the Internet has seen disinformation take advantage of new avenues like social media and electronic advertising. In fact, by 2017, bots – the nickname for automated online participants – were responsible for approximately half of all web traffic (Zeifman). Ironically, this reach has also given disinformation some degree of instability. The same properties that make disinformation difficult to trace make its reach difficult to measure. How many re-tweets of a piece of false information are actually read? Are real accounts tweeting them, or bots? Who exactly is being affected? In this way, disinformation is messy and has a Frankenstein effect: once it is created, it can no longer be controlled. None of this is considered a disadvantage, as its unpredictability functions to further obscure the perpetrator.
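Part of the measurement problem is that any sample of accounts amplifying a claim mixes humans with automation. A minimal sketch of how a researcher might estimate the bot-driven share of that activity, assuming hypothetical per-account fields and deliberately crude thresholds (new account, implausible posting rate) – not a real bot-detection model:

```python
# Crude bot-likeness heuristic over a sample of accounts that
# amplified a claim. The fields ("age_days", "posts_per_day") and the
# thresholds (under 30 days old, over 100 posts/day) are illustrative
# assumptions for this sketch.

def looks_like_bot(account):
    """Flag accounts that are brand new or post at inhuman rates."""
    return account["age_days"] < 30 or account["posts_per_day"] > 100

def bot_share(retweeters):
    """Fraction of the sampled amplifying accounts that look bot-like."""
    flagged = sum(1 for a in retweeters if looks_like_bot(a))
    return flagged / len(retweeters)
```

Even this toy version shows why reach numbers are slippery: the answer depends entirely on thresholds the analyst chooses, and a sophisticated operation tunes its bots to sit just under them.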

When disinformation targets a specific political party or government (often in the form of leaks, fraudulent and otherwise), trust in the institutions themselves is eroded. This can “cultivate a fatigue among the population” (Nyheter). In turn, this makes it more challenging for the targeted governments to craft solutions, both for policy reasons and because the population may be wary of the implementation of those solutions. This is compounded by the fact that disinformation campaigns often stoke populist suspicion of government functions.

Information laundering

Information laundering is a strategy for disseminating disinformation, in which material (leaked documents, false claims, forged photos) makes its way from a questionable source to the mainstream news. Often a document appears on a website anonymously or under the byline of someone posing as a professional. Fringe groups seize on the content and circulate it, and it eventually makes its way up the chain to more credible outlets. Along the way, sites that publish it profit from the clicks, whether they’re refuting the claims or supporting them. Laundering information takes advantage of a phenomenon called “cognitive hacking” (exploiting psychological vulnerabilities like gullibility) as well as media sensationalism. Whether it’s Russian conspiracy theories about the coronavirus or celebrity gossip, the United States contains an ideal media ecosystem for laundering information. Ryan Holiday details this in Trust Me, I’m Lying: Confessions of a Media Manipulator.

A classic example of information laundering was Operation Infektion. In 1983, the Indian newspaper Patriot, a New Delhi publication set up exclusively to disseminate Russian disinformation, published an article quoting a “well-known American scientist” who claimed that HIV was genetically engineered in a lab in the US and was being tested on Pakistan. Radio Moscow accompanied the article with claims that American scientists were intentionally spreading the disease to Africans and that the CIA was distributing “AIDS-oiled condoms”. The Patriot article was reprinted by Soviet news publications worldwide. It then jumped to non-Soviet outlets and eventually made its way to a Peter Jennings broadcast of the ABC nightly news. As recently as 1992, 15% of Americans considered it definitely true that “the AIDS virus was created deliberately in a government laboratory”. By 2005, 50% of African-Americans polled believed AIDS was man-made.

Similar claims about the coronavirus abound on Russian propaganda sites now. There are dozens of these sites, but a select few seem to be responsible for generating most of the material. 

Russian Disinformation Sites

Sophia Mangal. Milko Oejoic. Mehmet Ersoy. All of these authors have been published by the website Global Research, a Canada-based, English-language Russian disinformation site. They also share something with Martin Aspen – they aren’t real. They are fake personas, operated through Facebook accounts created by the GRU.

The articles sourced to these people come from a handful of Russian disinformation sites that have been identified by U.S. intelligence and other disinformation watchdogs, such as NewsGuard. Among these sites are SouthFront, NewsFront, and The Strategic Culture Foundation. All of these sites share content and authors, frequently cross-fertilizing material and passing it off as original. Global Research, the Canadian sister site mentioned above, has published over 1,200 pieces that previously appeared on SouthFront. The content invariably coincides with Kremlin talking points and is frequently uploaded anonymously. Western authors with names like Christopher Black and James O’Neill give these sites an air of credibility, though the topics are consistently pro-Russia. For example, in 2015, O’Neill wrote an article for NEO parroting the Kremlin’s denial that a Russian missile was responsible for the downing of Malaysia Airlines Flight 17.

Articles then make their way to the US via channels like the RT Twitter account, Facebook, and Global Research. One article on SouthFront – “US Hospitals Getting Paid More to Label Cause of Death as ‘Coronavirus’” – was quoted by Donald Trump. Another – “Former Putin’s aide: Coronavirus is the US biological weapon?” – recalls the AIDS article in the Patriot.

The sites go to extraordinary (but imperfect) lengths to obscure their ties to one another and their connections to the Kremlin. Tellingly, though, they often leave digital footprints that reveal these connections, either in the metadata of the web pages or in the reprinting of state documents word for word. 
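One concrete footprint of this kind, well known in open-source investigation, is shared web-analytics tracking IDs: two nominally unrelated sites that embed the same tracker in their page source are likely run by the same operator. A minimal sketch of the check, using the classic Google Analytics "UA-XXXXXXX-X" format; the HTML snippets in the test are fabricated for illustration, not taken from any of the sites named above:

```python
import re

# Extract analytics tracking IDs from raw page HTML and intersect
# them across two sites. A non-empty intersection is a strong hint of
# common ownership. The pattern matches legacy Google Analytics IDs
# of the form "UA-<account>-<property>".
GA_ID = re.compile(r"UA-\d{4,10}-\d{1,4}")

def tracking_ids(html):
    """All Google Analytics IDs embedded in a page's source."""
    return set(GA_ID.findall(html))

def shared_ids(html_a, html_b):
    """Analytics IDs that appear in both pages' source."""
    return tracking_ids(html_a) & tracking_ids(html_b)
```

In practice investigators also compare ad-network IDs, SSL certificates, and hosting records, but the principle is the same: infrastructure reuse leaves a trail that editorial anonymity can't hide.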

When these connections are revealed by investigators, the site’s cover is effectively blown. If the resulting publicity is bad enough, this can result in a complete ban from social media. This was the case with NewsFront. A Georgian civil society group, the International Society for Fair Elections and Democracy (ISFED), found NewsFront stealing user identities on Facebook and spreading disruptive messaging aimed at interfering with the Georgian election. Other social media companies followed: YouTube, where NewsFront had 484,000 followers and had generated almost 5 million views, removed the account in May of 2020, and Twitter quickly followed suit. At that point, the operators will either transfer the exposed site’s content to one of the sister sites or rebrand and open up a new website under a new name.

Similar banning occurred after Russia’s involvement in the 2016 election. According to the Mueller report, Russia began gathering information from US social media, specifically focusing on politics and other divisive topics such as religion and immigration. The Kremlin-linked Internet Research Agency analyzed the various metrics of social media: engagement (how people interact with one another, and how often), reach (how far a particular message spread), and content (the actual messages themselves).
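The three metrics described above can be made concrete with a small sketch. The field names below (likes, shares, comments, follower_reach, text) are assumptions about how post data might be structured, not an actual IRA or platform schema:

```python
from collections import Counter

# Toy computation of the three metrics named in the text over a
# hypothetical list of posts: engagement (interactions), reach
# (audience size), and content (a crude tally of divisive themes).
DIVISIVE_TOPICS = ("religion", "immigration", "guns")

def summarize(posts):
    engagement = sum(p["likes"] + p["shares"] + p["comments"] for p in posts)
    reach = sum(p["follower_reach"] for p in posts)
    topics = Counter()
    for p in posts:
        for topic in DIVISIVE_TOPICS:
            if topic in p["text"].lower():
                topics[topic] += 1
    return {"engagement": engagement, "reach": reach, "topics": topics}
```

An operator watching these numbers can see which divisive themes generate the most interaction per unit of reach, and double down on them – which is essentially what the Mueller report describes.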

Using a combination of stolen identities, fake identities, and automated identities (bots), Russia infiltrated social media platforms such as Facebook and Twitter, swamping them with content pushed from the disinformation websites mentioned above. These accounts gathered hundreds of thousands of authentic followers, allowing Russia to shape, if not control, the narratives about which it was posting. Countering Information Influence Activities: The State of the Art notes that “polarization was a key stratagem, on the grounds that sowing confusion and uncertainty was more effective than supporting a single candidate or specific ideology” (p. 81).

Russian-controlled accounts began organizing pro- and anti-Trump rallies, moving from online influence to real-world influence. Further, this activated actual Americans to (unwittingly) spread Russian disinformation to people outside of the social media sphere. It did all of this with a budget of roughly $2 million[1] and a team of roughly one hundred people.

Was Martin Aspen a product of the Kremlin? And were the Hunter Biden emails a Russian disinformation campaign? Until we have more information, it’s hard to tell. At this time, there is no definitive proof that the Biden laptop came from Russia, but its existence fits a pattern and serves Russia’s (and Trump’s) objective of discrediting Biden. An email dump is almost exactly what they did to Clinton in 2016. Further, Russia has run cyber-war operations against Ukraine, which seems like a far likelier way for them to obtain emails than an unclaimed laptop at a computer store. And Giuliani’s involvement only makes the Russia connection more likely. Whistleblowers have repeatedly fingered the former mayor as a “central figure” in the effort to solicit information on Biden’s son from Ukraine. Now he is under investigation as a target of Russian intelligence. None of this proves with any certainty that the emails were part of a Russian campaign, although more than fifty senior intelligence officials have signed a letter saying they believe they were. What is certain is that Russian disinformation will continue to affect democracies all over the world – the price of free speech is being vulnerable to its manipulation by outsiders. Even if the manipulators don’t technically exist.

[1] As far as the price of political influence goes, political advertisers on Facebook spent over $264 million in the third quarter of 2020 alone. Both campaigns’ total spending had reached over $11 billion as of October 2020.
