Blog by Jay Stanley, Senior Policy Analyst, ACLU Speech, Privacy, and Technology Project. Originally published in ACLU's Free Future

The New York Times on Monday ran an extensive article on how the locations of millions of Americans are being tracked by apps on their cell phones, bought and sold, and used for advertising and other commercial purposes.

Is your location data among them? Do you know for sure? Every time you visit a doctor, bar, Planned Parenthood clinic, or friend’s house, is some company storing the when, where, and with whom?

I was recently speaking about privacy before an audience of government officials who had just received a pitch from one of these location data companies. I asked everybody in the audience to put up their hands unless they were positive that data from their phone was not being collected. Nearly every hand went up. I then asked them to keep their hands up if they had consciously given permission for such tracking. Almost every hand went down.

That is the problem. Worse, if companies are collecting and warehousing these mountains of data, the government could get access to it as well.

The Times story, appropriately headlined “Your Apps Know Where You Were Last Night,” featured one woman whose location trails, collected by apps on her phone without her knowledge, showed her traveling between her home and the school where she teaches. They also showed her visiting a Weight Watchers center, a doctor, and her ex-boyfriend’s home. Another location record accessed by the Times tracked someone from a home outside Newark to a Planned Parenthood clinic.

The current state of our privacy is unacceptable. As new technologies make ever more intimate levels of tracking feasible, companies are competing to exploit them as quickly as possible, with the only limit being what can be done and too little examination of what should be done. As a result, American consumers are subject to a level of monitoring that has never before been experienced in the history of humanity — tracking that is more extensive than many understand and more intrusive than most are comfortable with.

The heart of the problem with tracking apps and the rest of our corrupted privacy regime is that it has been built around the concept of “notice and consent”: As long as a company includes a description of what it is doing somewhere in an arcane, lengthy, fine-print click-through “agreement,” and the consumer “agrees” — which they must do to utilize a service — then the company can argue that it has met its privacy obligations.

Our ecosystem of widespread privacy invasions has been allowed to fester based on the impossible legal fiction that consumers read and understand such agreements. The reality is that many consumers can’t possibly understand how their data is being used and abused, and they don’t have meaningful control when forced to choose between agreeing to turn over their data or not using a particular service.

Worse, technologists and academics have found that advertising companies “innovate” by altering their tracking technologies specifically to resist consumers’ attempts to defeat that tracking. This is done, for example, by using multiple identifiers that replicate each other, virus-like, when users attempt to delete them. Advertisers, the experts conclude, “use new, relatively unknown technologies to track people, specifically because consumers have not heard of these techniques. Furthermore, these technologies obviate choice mechanisms that consumers exercise.”
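To make the "virus-like" respawning concrete, here is a minimal, purely illustrative browser-side sketch in TypeScript. It is not taken from any real tracker's code; the storage key and the choice of just two storage locations (a cookie and localStorage) are assumptions for the example. The idea is simply that the same identifier is mirrored in several places, so clearing one copy only causes it to be quietly restored from another on the next visit.

```typescript
// Purely illustrative sketch of a "respawning" identifier (not real vendor code).
// The same ID is mirrored in two storage locations; if a user clears one,
// the script restores it from the other on the next page load. Real trackers
// have used many more locations (cookies, localStorage, IndexedDB, caches).

const TRACKING_KEY = "tracking_id"; // hypothetical key name

function readCookie(name: string): string | null {
  const match = document.cookie.match(new RegExp(`(?:^|; )${name}=([^;]*)`));
  return match ? decodeURIComponent(match[1]) : null;
}

function writeCookie(name: string, value: string): void {
  // One-year, site-wide cookie.
  document.cookie = `${name}=${encodeURIComponent(value)}; max-age=31536000; path=/`;
}

function getOrRespawnId(): string {
  const fromCookie = readCookie(TRACKING_KEY);
  const fromStorage = localStorage.getItem(TRACKING_KEY);

  // Use whichever copy survived; only mint a new ID if both were cleared.
  const id = fromCookie ?? fromStorage ?? crypto.randomUUID();

  // Rewrite both locations so deleting either one has no lasting effect.
  writeCookie(TRACKING_KEY, id);
  localStorage.setItem(TRACKING_KEY, id);
  return id;
}
```

A privacy tool that deletes only cookies never wins against a script like this; the user has to find and clear every copy at once, which is exactly the arms race described below.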

In short, not only is there no meaningful way for consumers to control how and when they are monitored online, but companies are actively working to defeat consumer efforts to resist that monitoring. Currently, individuals who want privacy must attempt to win a technological arms race with the multibillion-dollar internet advertising industry.

American consumers are not content with this state of affairs. Numerous polls show that the current system makes people profoundly uncomfortable.

What’s needed is privacy legislation that includes a meaningful “opt-in” baseline rule for the collection of any information. By “meaningful,” we mean, among other things, that care be taken not to allow it to degenerate back into the current “notice and consent” regime where consumers are forced to “agree” to arcane agreements that they cannot understand.

The advertising industry shouts that such protections for American consumers will “ruin the free internet.” But there is absolutely no reason that needs to be the case.

An ad-supported ecosystem of services can flourish without collecting massive quantities of data about individuals in secret and without their consent. Broadcast television stations were an extremely lucrative business throughout the second half of the 20th century, yet broadcasters were never privy to the intimate details of their audience members’ individual viewing habits. Insofar as television ads were targetable at all, it was not through “behavioral” targeting, but instead through good old-fashioned “contextual” targeting, in which ads are matched to the audiences that different shows attract. This is an effective means of targeting ads online, and one that is perfectly consistent with strong privacy protections. An advertiser that wants to reach golfers, for example, can place its ads on a site about golf or on pages returning the results for golf-related search terms.
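As a rough illustration of the difference, here is a small TypeScript sketch of contextual targeting. The advertiser names, inventory, and keywords are invented for the example; the point is structural: the ad is chosen from the words on the page being viewed, and no user identifier, location trail, or browsing history enters the decision.

```typescript
// Illustrative contextual ad selection (invented inventory and keywords).
// The ad is matched to the page's content, not to a profile of the reader.

interface Ad {
  advertiser: string;
  keywords: string[];
}

const inventory: Ad[] = [
  { advertiser: "GolfGearCo", keywords: ["golf", "putter", "fairway"] },
  { advertiser: "TrailOutfitters", keywords: ["hiking", "tent", "trail"] },
];

// Score each ad by keyword overlap with the page text; no user data is consulted.
function selectAd(pageText: string): Ad | undefined {
  const words = new Set(pageText.toLowerCase().split(/\W+/));
  let best: Ad | undefined;
  let bestScore = 0;
  for (const ad of inventory) {
    const score = ad.keywords.filter((k) => words.has(k)).length;
    if (score > bestScore) {
      best = ad;
      bestScore = score;
    }
  }
  return best;
}

// An article about golf swings gets the golf ad, whoever happens to be reading it.
console.log(selectAd("Improving your golf swing from tee to fairway")?.advertiser);
```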

Where ad-based services have been built upon ethically problematic, non-consensual monitoring of individuals’ private lives, that monitoring should be rolled back, just as the telemarketing industry was rolled back by the “do not call” registry. That rollback did not stop progress or innovation in healthier areas that benefit consumers more.

If we protect privacy and constrain behavioral advertising, ad budgets will not dry up, and ad-supported offerings will not wither away. Nor will innovation in online and offline services simply cease because the advertising industry has been barred from taking behavioral advertising to the next, even more intrusive, level.

These companies are exploiting the inevitable lag between the moment when people’s privacy has been stolen by technology and the moment when they realize that it’s been stolen. But in the end, that gap will close, because people demand privacy. Strong privacy protections that block the kinds of tracking reported on by The Times are entirely compatible with a robust and flourishing economy, online and off.

In fact, such protections will establish predictability and stability of expectations that will enhance consumer confidence, prosperity, and innovation.