Wednesday, August 5, 2015

How Facebook Will Probably Manipulate the Next US Election


It's a well-accepted fact that money heavily influences US politics, with the candidate who spends the most money usually winning the presidential election. The 2016 US presidential election is expected to involve the most campaign spending in history. But what is often overlooked is that it isn't money in itself that wins elections. You can't directly buy votes. So why does the correlation exist?

Money buys attention. At the end of the day, every voter can vote for whomever they want, but campaign spending determines how saturated they are with information and propaganda about the various candidates. Money makes a candidate a household name, pays for billboards and banners, buys attack ads to discredit rivals, and so on. Money doesn't buy elections directly; it's a means to influence what voters know, which in turn affects how they vote.

News media have understood this for years, of course, and most readers and viewers know the political leanings of their sources and aren't surprised to see biased reporting. But what happens when people think they're getting unfiltered information? What if you can make people believe they're getting a balanced view, while that view gives the distinct impression that one candidate is better than the others?

This is where social media becomes the biggest game in town. People don't think of their Facebook news feed, their Twitter feed, or their Google searches as politically biased, or as tools a private company can use to manipulate what they see and therefore influence what they think. They tend to think of these as neutral sources, biased at most towards showing them things that will keep them clicking, or advertising things they're likely to want. They don't think of social media companies as manipulating what they think, feel, and understand about the world.

With the amount of time people spend on social media now, it would be pretty bad if those profit-driven companies were to take advantage of their completely legal power to manipulate what they show their users, right?

Various surveys have made clear the growing role Facebook plays as a news source. More and more people rely on the news articles shared by friends, and on the "related stories" links that accompany them, as a main source of news. So if the items appearing in their feeds were filtered to favour articles that are positive about one candidate or party and negative about the others, those people would have their opinions manipulated to some degree, and would end up more likely to vote a certain way.
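To make that mechanism concrete, here is a minimal sketch in Python of how a tiny bias term in a feed-ranking function could quietly reorder what a user sees. Everything in it (the Article fields, the rank_feed function, the weights and scores) is invented for illustration; it is not Facebook's actual ranking system, just a toy model of the idea.

    # Hypothetical illustration only: the names, weights and scores below are
    # invented, not taken from any real ranking system.
    from dataclasses import dataclass

    @dataclass
    class Article:
        title: str
        engagement: float  # predicted likelihood the user clicks/shares (0..1)
        sentiment: float   # -1.0 (negative about the favoured candidate) .. +1.0 (positive)

    def rank_feed(articles, bias_weight=0.0):
        """Order articles for a user's feed.

        With bias_weight == 0 the ordering is purely engagement-driven.
        A small positive bias_weight promotes articles favourable to the
        preferred candidate and demotes unfavourable ones.
        """
        return sorted(articles,
                      key=lambda a: a.engagement + bias_weight * a.sentiment,
                      reverse=True)

    feed = [
        Article("Candidate A caught in scandal", 0.80, -0.9),
        Article("Candidate A unveils popular policy", 0.75, +0.8),
        Article("Cat rescued from tree", 0.70, 0.0),
    ]

    print([a.title for a in rank_feed(feed)])                   # unbiased ordering
    print([a.title for a in rank_feed(feed, bias_weight=0.1)])  # quietly biased ordering

The point of the sketch is how small the nudge can be: a bias term a tenth the size of the engagement score is enough to push the unflattering story below the flattering one, and an individual user scrolling their feed would have no way of noticing.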

We also know that Facebook is very politically active, spending around $10 million a year on lobbying. This means the company clearly has an interest in certain political outcomes, and cares enough to devote time and money to getting its way.

One more important piece of the puzzle: we also know that Facebook has both the means and the willingness to manipulate people's news feeds, and can do so secretly if it wants. We know of one secret experiment in which they manipulated hundreds of thousands of users' feeds to see whether they could change their emotional states. They were able to do this in secret, and we only know about it because they later disclosed it. This means they have probably done other experiments that they haven't disclosed, but more importantly, it proves that they can carry out this kind of manipulation secretly if they want to.

And finally, and probably most importantly, none of this is illegal. Facebook has no obligation to be fair, neutral, or unbiased in filtering what it presents to people.

So, when you put all this together, Facebook has the means to secretly manipulate the news that millions of US citizens see every day, and they have political interests that are important enough to them to be worth spending millions of dollars on. Further, they have demonstrated a willingness to do this kind of secret manipulation of their users.

The only real reason to believe that Facebook won't try to manipulate the next US election is if they think the manipulation is likely to leak, and that, if it did, the long-term repercussions to the company would outweigh whatever they stood to gain. Given how deeply entrenched Facebook is in the social media ecosystem, and given that they could, of course, also limit how much people found out about any leak, I don't think it's that crazy to suspect that they'll do it.

In summary:
  • Election results are influenced by the news and information that voters are exposed to
  • People are getting a large proportion of their news from Facebook
  • Facebook has political interests and already spends time/money on lobbying
  • Facebook can manipulate what news and information users see
  • Facebook has previously demonstrated the ability and willingness to manipulate users' news feeds, and was able to do so secretly
  • There is nothing illegal about Facebook doing this
  • The only real reason for them not to do it is if they think the general public might find out, and that the resulting bad press would hurt them more in the long run than the manipulation would gain them

Now, I've been focusing specifically on Facebook and one particular event here, but most of the arguments apply to other social media companies and other kinds of manipulation. We really do need to be aware of just how deeply these companies are integrated into our lives, and how much ability they have to control what we know. These are profit-driven companies whose primary purpose is to serve their customers and shareholders, and we're the product. We should not make the mistake of assuming that they have our interests, or those of society at large, as a main priority. They might, of course, but they're corporations. Generating profit is what they exist to do, and we need to remember that.
