Consumer News

TINA’s Take: ‘You Can’t Interrogate an Algorithm’

TINA.org attends an FTC hearing on how to protect consumers from “informational injury” in the digital age.



The FTC is facing a technology crisis over how to measure and protect consumers from “informational injury” in the digital age. When a hotel overcharges guests, it’s easy to calculate the damage. But how do you put a price tag on someone’s personal information, particularly when a company that promised to protect that data breaks its word?

That’s just one of the questions the FTC hopes to answer in a series of hearings on consumer data security that kicked off last week, with TINA.org in attendance. Another: How do you prevent and police informational injury? Because, as one panelist who served as director of the FTC’s Bureau of Consumer Protection (BCP) under President Obama put it, “You can’t interrogate an algorithm.”


Just by using some websites and apps, you are agreeing to a privacy policy that dictates how your data will be used and/or shared. While consumers have expressed a desire to know how their data is being used, they rarely read the privacy policy. Even if they do, it can be difficult — if not impossible — to make sense of the document, which is often several pages of dense legal jargon. Some consumers assume that if a company has a privacy policy, it doesn’t share information with third parties. That is not the case.

“Consumers come in with this baggage and we have to play in that world,” said Daniel Solove, an expert on information privacy law and professor at George Washington University Law School, speaking at the hearing, which was held last week at Georgetown University Law Center. “There should be some protection so they’re not exploited.”

Solove called the current state of data security regulations “a mess.” Laws are inconsistent across state and federal governments, he said, and much of what is on the books is ill-suited to the challenges of today’s technologically advanced world. The FTC itself mainly uses technologists for forensics in investigations, said David Vladeck, the former director of the BCP under Obama.

Meanwhile, despite what may be outlined in a privacy policy, there is a lack of transparency in how companies share data, especially when it comes to artificial intelligence and the algorithms through which information travels. Vladeck called it “a black box system” where consumers can be subject to “tyranny by algorithm.” Not wanting to hand over the keys to the machines just yet, the FTC is now grappling with how to protect consumers and hold companies accountable without stifling future innovation.


Several panelists brought up the FTC’s case against Ashley Madison as an example of what can go wrong when a company fails to deliver on its promises to protect consumer data. The dating site, which encourages people who are married or in a relationship to have an affair, marketed its services as “100% secure and anonymous.” Then, in 2015, the site was hacked, exposing the names, relationship statuses, sexual preferences and desired encounters of more than 36 million users, including those who had paid $19 for a “Full Delete” to scrub their data from the site. The breach not only broke up marriages; it was also linked to suicides, Vladeck said.

“Labeling that kind of harm is difficult,” he said.

Ashley Madison paid $1.6 million to settle the FTC’s allegations that it deceived consumers on the security of the site. The settlement also required the company to implement a “comprehensive data-security program.”


Technology gives consumers a lot of what they want: free content, information at their fingertips, the ability to stay connected with friends and family even if they’re on opposite ends of the world. And, as Howard Beales, who served as director of the BCP under President George W. Bush, pointed out, big data can help confirm a person is who they say they are, protecting against identity theft. But that doesn’t mean tech companies should be given carte blanche to collect and share data without giving consumers fair warning, regardless of whether they’re “motivated to do the right thing,” as one panelist representing the industry posited.

As the FTC seeks a regulatory middle ground that doesn’t fence in future innovation, a good place to start would be requiring companies to write their privacy policies in plain English and to post them where consumers can actually see them, i.e., not buried at the very bottom of their websites. That, at least, would give consumers a fighting chance.

Find more of our coverage on privacy and security issues here.
