Facebook, Twitter, Google and other tech companies have been, and will continue to be, testifying before members of Congress about what the hell just happened. Specifically, lawmakers want to know how those platforms were used by foreign agents to disseminate false information, both organically and through paid ads, in the lead-up to last year’s elections.

Given how much of the electorate gets its news through these platforms, the stakes are fairly high. Both Twitter and Facebook have repeatedly revised upward their estimates of how many people were exposed to these ads and messages. At this point, over half of all Americans saw some sort of propaganda designed to destabilize our democracy by inflaming racial, religious and other prejudices.

In response to all this, the tech companies in question have reverted to their favorite line of defense: “We’re just a dumb platform.” They claim they can only do so much because users are responsible for what they see. They also downplay the efficacy of whatever ads were displayed, something they do control, seemingly unaware that this position directly contradicts the “you need to buy ads on our network because they work” pitch made to every business in the country.

Ironically, the quagmire Facebook finds itself in hasn’t hurt its advertising revenue, as companies continue buying ads. The company did, though, attempt to stave off regulation by claiming that seriously fighting the spread of fake news and weeding out manipulative advertising would cost so much it would hurt future profits. Essentially, it’s arguing that if lawmakers really love capitalism they’ll back off. It’s the same argument the banks have made for years.

Those warnings may be true, though you could argue that acting in the public good matters more to America than profit margins.

Instead of throwing money at the problem, there seems to be one simple solution that could hinder the impact of outside influence before it begins.

What if the additional reviewers these companies plan to hire (more likely low-paid contractors who don’t enjoy the perks of full-time employees) were tasked with verifying the authenticity of any new Page for someone claiming to be a news organization or advocacy group?

Based on the examples exposed in recent months, it seems most of these groups don’t stand up to even the most cursory research, something that’s surely within the capabilities of companies like Facebook and Google. They have more tools than the average person and could easily spot that a group claiming to represent Native Americans launched its website just three months ago, or ask why that site is registered in Myanmar.

The companies in question would likely argue that such research and verification isn’t their responsibility. But it is. Landlords have to verify that a prospective tenant is who they say they are, and the repercussions of failing to do so don’t affect the lives of all Americans the way bad actors renting online real estate do.

Put those resources up front and stop the problem before it starts. If anything comes out of the scrutiny now being turned on the tech companies that control what we do and don’t see, I hope it’s that.

Chris Thilk is a freelance writer and content strategist who lives in the Chicago suburbs.