
Facebook To Subject Thousands to Bizarre Conditioning Experiment

I can’t be the only one whose mind, upon reading the news that Facebook would be hiring 3,000 people worldwide to review reports of violent videos being shared, immediately jumped to the scene from A Clockwork Orange in which Alex is forced to watch scenes of extreme violence. The goal of the process he’s put through is to turn him away from a life of crime, one in which he thought nothing of murder, rape and other terrible acts. To make sure he can’t look away, his head is held in place and his eyes are forced open.

The hiring comes in the wake of a handful of recent instances in which people have streamed themselves committing crimes – including rape and murder – to the social network via its “Live” video offering. The platform has allowed those who are already so inclined to sidestep existing media filters and take their message directly to the public, adding a voyeuristic element to their acts. They don’t just want to commit the crime, they want to boast about it. That kind of thinking takes a special sort of twisted mind.

The Appearance of Change

Facebook’s goal is, at the very least, to *look* like it’s doing something to address what has been called out as a major failing of the network: that it’s slow to remove these videos and lacks clear editorial standards for doing so. Anytime a company announces it’s hiring a bunch of people to tackle a specific problem, the goal is to change the conversation from “what it should be doing” to “what it is doing.” Facebook has successfully accomplished that.

That the review team will, according to the announcement post by Mark Zuckerberg, total 7,500 people after this hiring should tell you something about the scale of the problem. There are obviously a lot of people reporting videos and other posts as containing violence of some kind. That, combined with other recent developments, brings to mind a few questions.

At What Point Is Facebook Dangerous?

This isn’t hyperbole, at least not much. If it takes a team of 7,500 people just to begin addressing reports of violent content, how nasty and dangerous is Facebook overall? Those people can’t be everywhere, and even if they could be, they’re squashing bugs rather than stopping the infestation at its source.

At some point someone needs to ask, particularly in light of Facebook’s role in disseminating false or purposefully misleading news in recent years, whether it is substantially harming the public. By refusing to adopt (or admit to) an editorial mindset regarding the news that’s shared, and by providing a free, unfiltered platform to violent people seeking attention, a case can be made that Facebook is doing more harm than good. It is not, to use a journalistic phrase, serving the public interest. And without even noticing, we may have passed the point where it shifted from being a benign force that gives everyone a voice and a way to keep up with their aunt in Colorado to one whose indifference has allowed it to be hijacked by the worst actors in society.

What’s the Long Term Plan?

It’s hard to believe that creating and expanding a team of thousands of human reviewers is a permanent solution. That runs counter to the direction Facebook is taking on virtually every other project. The goal, I have to think, is to have these humans do their thing as a way of teaching an AI to do it automatically.

That assumption is based on precedent, not just at Facebook but at Google and other companies as well. Just last year Facebook laid off a much smaller team, just over a dozen people, who were responsible for managing its Trending Topics section. That team’s job was to keep doing its work until the algorithm had learned enough to make those decisions on its own, essentially putting editorial responsibility in the hands of a machine. That turned out exactly as badly as everyone expected it to.

This Impacts Us All

Facebook isn’t going anywhere anytime soon. Using it may be something many of us do reluctantly, but it’s become so ingrained in the fabric of modern communications that it’s hard to draw a line and say you’re not going to be at least moderately active there. It’s an ecosystem that most of us, however grudgingly, live in and one that affects the larger environment for those who don’t. So the policies and procedures it puts in place for handling content on its platform are of material importance to everyone; they aren’t just an abstract argument being had by a handful of media pundits (on Twitter).

A core problem here is Facebook’s lack of transparency about its procedures, along with the absence of a real appeals process for contesting a decision made in secret. Many content program managers have woken up one day to find their pages gone because they were flagged as violating some term, with no recourse available. A similar fate could befall anyone at any time if a video of something as innocent as a paintball weekend with coworkers is determined to contain violent material.

The future of media, both personal and professional, is being determined by a secretive tech company with zero accountability. Its latest move to increase the attention paid to violent videos may seem like an important update to tackle a difficult issue, but Facebook’s track record makes it easy to predict that this won’t actually move the needle on anything in a substantive way. That’s largely because it can’t – won’t – do anything to discourage you from posting everything you can. It can’t – won’t – put any speedbumps on the path between your life and your decision to make it part of your public persona, all in the name of gathering more data about you to sell to advertisers. It will make noises about what it’s doing to address this and other issues, but it’s easy to assume this will all be window dressing meant to look like action.
