Google is rolling out Feed, a personalized stream of news and information it believes will be relevant and interesting to users, to Android phones now, with iOS and desktop web versions coming soon. You’ll be able to offer the algorithm suggestions, tell it when something doesn’t strike your fancy, and follow certain topics that are shown.
That’s just another example of a company essentially deciding you’re not smart enough to be left in control of your own media experience. Instead of offering a blank canvas for you to fill in on your own, Feed gives you a select set of materials already arranged in a particular order it feels is optimal, leaving the user only enough freedom to rearrange things slightly.
More than that, even, we’re asked to train these systems in what we like. All those likes, dislikes, and other nudges are meant to help the AI that powers the news display learn more about us and, theoretically, offer better content.
What really jars me is that there’s an inherent lack of logic behind all of these requests – be they from Facebook, Google or any other company – for readers to train the AI that powers these services. If the reader is so smart and so in tune with what they want, then why put an algorithm between them and their feed? Why not let them take the positive action to follow a social profile or subscribe to a feed and let them sort it out and manage the inputs as they see fit?
It’s because this isn’t about letting people create the media experience they want. RSS did that. Even Twitter’s firehose, unfiltered approach to updates does that. Everything else isn’t about allowing people to exercise control; it’s about gathering data on them that can be used to better target advertising. Every signal that’s sent is one more data point that can more finely tune the next wave of ads.
Google’s Feed may be fine, but it still comes from a mindset that believes you’re not smart or responsible enough to be left in control of your own media experience. That’s an approach that not only carries with it plenty of opportunity for abuse but, at least to date, has come with zero accountability for problems and abuses that have already popped up.
Give people the tools they need to create their own personalized feeds. Don’t force a model on them out of the arrogance that believes programmatic curation is better, especially not when the real goal is just more, and more intrusive, advertising.
Chris Thilk is a freelance writer and content strategist who lives in the Chicago suburbs.