Rigged Game


A hashtag analysis by multimedia artist Erin Gallagher.

Brad Parscale, the former digital director of the Trump campaign, was recently asked about his retweet of @TEN_GOP, a Twitter account that appeared to belong to the Republican Party of Tennessee but turned out to be run by a notorious Russian troll factory.

“Yes, I feel bad that it was a — it was not a Tennessee account — that I got fooled that it was a Tennessee GOP account,” Parscale said.

Here’s the message he retweeted: “Thousands of deplorables chanting to the media: ‘Tell The Truth!’ RT if you are also done w/ biased Media!”

Forgive Parscale for not seeing the Russian hand behind the account. Many news outlets and others amplified that account’s messages, sometimes in support, sometimes to debunk them, and sometimes to cite them as reflective of the views of some Americans.

The last point is one of the most important to reckon with. That account, as well as the many others identified by congressional investigations on Twitter, Facebook, and Instagram, blended in with the flow of content from Trump supporters. The accounts didn’t set the agenda, but rather amplified and mimicked what was already being shared, stacking more wood atop an existing bonfire of partisanship and social division. And it wasn’t just pro-Trump content.

For example, can you tell which of these Bernie Sanders posts is from a Russian troll page, and which is American?

Or how about the similarity of these posts about Bill Clinton?

The Russian effort exploited one of the great promises of social platforms — a level playing field — to blend in with other content being pushed out during and after the election. Russian propaganda mixed with an avalanche of hyperpartisan political content, which itself inspired fabricated news stories from fake news publishers, which were in turn copied and pushed out by hundreds of young Macedonian spammers. These messages, stories, and memes traveled in the very same containers and pathways as their legitimate counterparts, across platforms like Twitter and Facebook.

These platforms blur the lines between people, entities, and types of content. Accounts can be people or companies or governments. Multiple Facebook pages or Twitter accounts can be run by the same people, but you’d never know to look at them. A tweet or Facebook post can be turned into an ad, which can then accrue additional reach thanks to people engaging with it in a genuine way. Everyone is here and anyone can be anything! Fittingly, it brings to mind the title of Peter Pomerantsev’s book: “Nothing Is True and Everything Is Possible: The Surreal Heart of the New Russia.”

“It’s not just about real vs not real. It’s about a ‘flat space’ where art/people/cities/businesses all same account.” https://t.co/hqF1Ic1aKm

If you can’t tell whether a Facebook or Twitter account is run by an American, a Macedonian spammer, or a Russian troll, then that’s great news for the Macedonians and Russians, or others seeking to push false information on social media. The fact that so much attention could be harvested on social media by fostering division, confusion, and conflict speaks volumes about American politics and society — but also about these massive platforms that cloak themselves in the values and talk of liberal democracies.

One of the unintended consequences of the so-called “flattening” effect of platforms is that, by ostensibly putting everyone on the same level, you empower those who become experts at gaming the system. By democratizing media on platforms that reward pure attention capture, you enable manipulation on a profound scale.

Thanks to the internet, the marketplace of ideas is more open and more democratized than ever before. Yet thanks to social platforms, it’s also been rigged to reward those who can manipulate human emotion and cognition to trigger the almighty algorithms that pick winners and losers.

“Whatever piece of content, however brilliant or vile, that received an escalating chain reaction of user engagement would receive instantaneous, worldwide distribution,” wrote former Facebook manager Antonio García Martínez.

Of course, the previous system wasn’t perfect, either. Prior to the internet, and especially before social platforms, the media was dominated by large entities that operated the massive production and distribution systems required to gain reach: satellites, transmitters, printing presses, etc. A relatively small number of people and companies dictated the news and information available.

These entities still have advantages on social platforms: They can more easily attract followers thanks to an established brand, and they’re often given a leg up in the form of verified accounts and partnerships with the platforms themselves. But in the end, they still have to go toe-to-toe with the Macedonian publishers who don’t care whether a story is true, just that it performs well on Facebook. They have to compete with Russian information operations that have budgets to spend, dedicated trolls working around-the-clock shifts (just like in a newsroom), and social divisions to mine.

Layer in algorithmic filtering that promotes the content that generates the most engagement and you have what New York Times media writer John Herrman refers to as “The Big Huge Black Box Attention Market.”
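To make that black box concrete, here is a minimal sketch, purely illustrative and not any platform’s actual formula, of a feed ranked on engagement alone. The Post fields and weights are assumptions; the point is that the score never asks who the author is or whether the content is true.

```python
# A toy engagement-first ranking. Illustrative only: not any real platform's algorithm.
from dataclasses import dataclass

@dataclass
class Post:
    author: str       # a newsroom, a Macedonian spammer, or a troll farm; the score doesn't care
    likes: int
    shares: int
    comments: int
    age_hours: float

def engagement_score(post: Post) -> float:
    """Reward raw reactions (assumed weights), with a simple decay for older posts."""
    reactions = post.likes + 3 * post.shares + 2 * post.comments
    return reactions / (1.0 + post.age_hours)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Whatever provokes the strongest reaction rises to the top, brilliant or vile."""
    return sorted(posts, key=engagement_score, reverse=True)
```

Under a rule like this, a fabricated story engineered to provoke outrage outranks a careful report that isn’t shared as widely, which is exactly the dynamic the trolls and spammers learned to exploit.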

A shift in the market

But now, as a result of the effectiveness of these very same trolls and spammers, we are seeing the early stages of a major shift in the attention market away from the level playing field concept platforms have always espoused. Instead of leaving it to opaque algorithms to determine what gets more attention, Facebook, Google, and Twitter are now more publicly putting their thumbs on the scale.

Facebook, Twitter, and Google are highlighting “trust symbols” in news articles to signal to readers which outlets may be more worth their attention. This follows other attempts to show more contextual information for links, and to flag and algorithmically downgrade content that fact checkers have deemed false. Twitter is rebooting its account verification program, taking the check mark away from white nationalists in the process, and warning that more action is to come. (Which naturally means more questions about its decisions; Twitter recently removed the check mark from an NBA player’s account, for example.)
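As a purely hypothetical extension of the toy ranking sketched above (reusing its Post and engagement_score), here is one way a publisher-trust signal and a fact-check flag could put a thumb on the scale. The signal names and multipliers are assumptions, not anything the platforms have disclosed.

```python
def adjusted_score(post: Post, source_trust: float, flagged_false: bool) -> float:
    """Hypothetical adjustment layer: source_trust in [0, 1] is an assumed
    publisher-trust signal; flagged_false marks content deemed false by fact checkers."""
    score = engagement_score(post)
    score *= 0.5 + source_trust   # boost recognized sources, demote unknown ones
    if flagged_false:
        score *= 0.2              # algorithmically downgrade debunked content
    return score
```

Ranking on adjusted_score instead of engagement_score is the thumb on the scale described here: the playing field is no longer level by design.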

Google is going to be “carefully curating” the results that show up in its “Top Stories” section after misinformation repeatedly made its way into those coveted spots during recent breaking news events. Google’s YouTube is also cleaning up its search and recommendation results for news events after similar failures.


An example of contextual information about a news story that is now being shown by Facebook.

Farhad Manjoo of the New York Times is one of many voices encouraging platforms to unlevel the playing field.

“It’s time for Twitter to scrap one of its founding principles: the idea that it is an anything-goes paradise, where anyone who signs up for a voice on its platform is immediately and automatically given equal footing with everyone else, and where even the vilest, most hateful and antisocial behavior should be tolerated,” he wrote.

Rather than presenting themselves as a place where all things are equal and anyone can be anything, the platforms are now overtly promoting and removing content and giving and revoking privileges in order to tell you what should and shouldn’t earn your attention. They’re unapologetically unflattening their products to reduce the incredible potential for manipulation.

This new approach, if it continues, will likely benefit media incumbents, those with real-life influence, and perhaps those publishing more mainstream content. (Mainstream according to whom? The platforms!) Companies that work closely with these platforms are also more likely to enjoy the benefits of this new approach. (BuzzFeed News, for example, makes a live morning show for Twitter.) It hopefully means less misinformation and fewer malicious bots, but it could also mean an even harder slog for the credible long tail of websites that lack the connections or scale to receive anointed status or a platform push.

This is the beginning of the end of the pure view of social networks, of the way these companies have talked about their services since the very beginning. But it more accurately means an end to a false promise, an insidious misperception.

The platforms have always been in charge and have always been picking winners and losers. They built systems that prioritized certain things over others and that provided the framework for actors of all types. They chose to have algorithms enforce their rules and presented this as an impartial process. It never was.

During his recent interview, Parscale said he wouldn’t have retweeted that Twitter account if it had displayed a Russian flag or IP address. He was, in a way, endorsing the idea that accounts and content on platforms should come with more context and information to enable more informed decisions. This is the kind of contextual information that Facebook is testing right now. The platforms hope these new changes will make it easier to stop bad actors in the act, or at least enable users to make better decisions. (They also hope this new approach will help them avoid regulation.)
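A minimal sketch of that kind of contextual label, with hypothetical field names (no platform has published such a schema), might attach provenance metadata to an account and render it before anyone amplifies the post:

```python
from dataclasses import dataclass

@dataclass
class AccountContext:
    handle: str
    display_name: str
    operated_from: str   # assumed field: the country the account is actually run from

def provenance_label(ctx: AccountContext) -> str:
    """Render the kind of contextual banner described above, shown before a retweet."""
    return f"@{ctx.handle} ({ctx.display_name}), operated from {ctx.operated_from}"

# Grounded in the article's example:
# provenance_label(AccountContext("TEN_GOP", "Tennessee GOP", "Russia"))
# -> "@TEN_GOP (Tennessee GOP), operated from Russia"
```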

How this plays out, and how far Facebook and others will go to clearly privilege certain actors and content, remains to be seen. The only certainty is that if this new approach becomes permanent, and if it plays an important role in the content the algorithms promote, the attempts at manipulation will move in that direction. The people whose livelihoods depend on capturing attention will adapt and find ways to exploit The New Big Huge Black Box Attention Market.

This new attention market will still be rigged — we just don’t know exactly how yet.
