It’s kind of remarkable that this post got published. Facebook was reacting to reports from over the weekend that people had found their entire phone and chat histories in the data they exported from Facebook. They were shocked to find that years and years of non-Facebook activity were included. In the wake of the continuing Cambridge Analytica fallout, everyone was very receptive to another story about Facebook and data tracking.

The company’s response, though, was shockingly tone-deaf, as most things coming out of Silicon Valley tech companies are. Facebook very clearly states that it is *not* pulling this data without the permission of the individual, that users have *agreed* to allow access to this behavior.

Yeah…not actually what we’re discussing here. That may have been the way the story was framed by some writers, but it’s not what we’re talking about.

It didn’t take long for everyone to realize that the problem with the Cambridge Analytica situation wasn’t that the data had been stolen, or even necessarily that there had been a breach. Certainly the use of data in a way that violated Facebook’s terms of service for partner companies was troubling, but what really became crystal clear is just how much data companies have on us.

I’m not the first to admit that I’ve willingly played my role in the data economy, trading information about myself for access to social networks and other sites. I’ve clicked the “Agree” box on terms and conditions without reading all 57 pages of legalese. That’s on me, a decision I’ve made freely.

What Facebook’s statement misses is that we realize they weren’t doing it secretly. It’s just that now we’ve been shown what’s behind the curtain and are no longer cool with the terms of engagement. We would like to renegotiate – or at least discuss – the contract between the public and those harvesting our data for purposes we’re not always going to be on board with.

Not everyone can embrace the #DeleteFacebook movement, for one reason or another. That’s reality. Facebook might be their only connection with their family, a way to access an important support community, or a necessity for their job. The reasons will be varied, but the result is they can’t do it.

There needs to be another way, then. And I maintain that simply rebalancing the power dynamic between the public and those who want to collect our data will be enough to keep government regulation, which more and more people are in favor of, to a minimum. If the U.S. were to adopt something like the General Data Protection Regulation that’s about to go into effect in Europe, people could opt in to specific activities instead of being subject to a blanket agreement saying all data is on the table. If they don’t mind personalized ads but don’t want to be part of vaguely defined research, they should have that option. There are other, smarter people who have reasonable solutions to this.

In short, as Dave Winer and others have said, this is *our* data, and the structures of the online world need to be realigned to put the power back in the hands of the public. We should know how it’s being used, be able to export it and take it elsewhere easily, and make changes when we see fit.

Let’s also be clear: This is not a problem that’s unique to Facebook or any social tech company. The public has a right to know how the information collected on us through retail loyalty cards is being used. Or how research companies are tracking us as we walk through stores, museums or other public locations. It extends to government as well.

For Facebook, a good first step would be to fully understand the conversation that’s happening and acknowledge the nuance between “I didn’t know this was happening” and “I just discovered this secret thing was happening.” The company is understandably feeling pressured by the scrutiny resulting from the Cambridge Analytica situation, and may have rushed this statement out after coming under fire for its delayed response to those revelations.

The lack of emotion and understanding in that statement almost makes it seem there were no non-techs involved in drafting it, which is a bad idea when your corporate reputation is taking such a hit. At this moment Facebook – and all tech companies – need to be very careful about their public pronouncements. There are more and more calls for the upending of the bro culture that has held fast in tech for so long, one that has overlooked potential problems because the teams involved had only one point of view.

Having a jerk in the room, someone who is willing to take the most annoying stance on an idea to test its feasibility, is a good idea. What seems to be needed now is a Chief Empathy Officer, someone who is specifically tasked with considering the feelings and needs of others and who has veto power over important product and marketing programs. Such a role might upset some apple carts, but it’s going to help. Not only would some bad product ideas be squashed early on, but statements like the one Facebook released around activity tracking could be made much, much more appropriate.

Chris Thilk is a freelance writer and content strategist who lives in the Chicago suburbs.