Waves of negative press about privacy intrusions and breaches have become a regular occurrence. Yet the outrage cycle seems to have a short lifespan, and there's confusion and a wide range of expectations when it comes to data collection and usage. That said, consumers need to remain vigilant about new and unexpected ways digital organizations leverage personal data collection.
Already, we face some pretty significant questions: Do you want your genealogy company to sell analysis based on your DNA? Your smart doorbell clips accessible by warrant? Mugshot-matching facial recognition capabilities?
That’s why Stuart A. Thompson, graphics director of The New York Times Opinion section, and his team set out to figure out how people report their comfort levels in terms of sharing different kinds of personal information. The work, published earlier this month, is part of the ongoing Privacy Project at the paper. It’s a series examining the different ways private groups and governments are using personal information.
Understanding the limits
Perhaps encouragingly, The Times found that consumers are taking notice of the data collection policies of the services they choose to use. But they're not exactly excited about it when they find out. Many of the activities asked about — which fell outside of 90% of respondents' comfort levels — are already employed by tech companies.
Privacy by design
Thompson said the goal of the project was to learn where "the line" was for his readers, and he used a pretty clever page layout to do so. Readers scroll through increasingly intrusive privacy scenarios and then place a line over the example that made them uncomfortable. This skeuomorphic interaction is a technique that The Times has previously used with good results.
“Drawing things was a New York Times/Upshot trademark way to get people to interact with us,” said Thompson. “We were talking through as a group, and I think we kind of fixated on the line.”
One element Thompson wanted to address head-on was the idea that consumers continue to make privacy-related decisions that aren't in their interest, dubbed the privacy paradox.
Understanding the big picture
However, to Thompson, they're not acting irrationally or inexplicably. People are just caught in larger systems. Individuals aren't given a real opportunity to consent to the use of their personal data, such as in cases where data must be handed over in order to access a service, and there aren't consumer guardrails around those practices.
He added that tech companies have taken advantage of consumers’ difficulty with risk management. As a result, governments and regulatory bodies have had to enact legislation like the General Data Protection Regulation in the European Union.
The stakes get bigger
Another example of the pushback against the surge in the surveillance economy was a lawsuit against the efforts by Alphabet subsidiary Sidewalk Labs in Toronto. The plan to build a smart city in the Port Lands of Canada's largest city could incorporate any number of invasive new forms of data collection.
To nip that in the bud, the Canadian Civil Liberties Association launched an application to sue on the grounds that the wholesale data collection of people who just happen to be in the area isn't allowed under the Charter of Rights and Freedoms, whose protections include the right against unreasonable search and seizure. While this case is immediately relevant for the smart city project, it could help society establish a point at which it "draws the line" about privacy overall.
But at the end of the day, the path to better privacy management could start closer to home.
“As publishers are working in the public interest, maybe they have an additional responsibility to declare what they’re doing,” said Thompson. “You could argue that they are acting in this pure business sense and they have their own lines that they’re willing to draw, [but] maybe publishers should think about what they’re comfortable with.”
Republished with kind permission of Digital Content Next, advancing the future of trusted content