Data Privacy: Finding Balance

LEGAL • TECHNOLOGY 01.02.18

Modern life is raising difficult questions about consumer privacy and the data ownership landscape.

LAST YEAR, EQUIFAX joined Yahoo, Target, Home Depot and Uber as the latest major organization hit by a massive data breach when the financial data of over 145 million Americans was stolen between May and July 2017.

In a tdwi.org article, Dr. Barry Devlin, an expert on business insight and big data, proposed that consumers should ultimately have the right to license their own data to other parties “under defined, binding and enforceable conditions,” and that “care of data licensed in this way remains the responsibility of the party holding the data.”

The European Union (EU) will point a bright light onto the data ownership landscape beginning May 2018 by giving its citizens more control over their data with the General Data Protection Regulation (GDPR). The reform will place “strict regulations over the collection of personal data and issue penalties for its misuse and exploitation by those who gather it,” as reported by zdnet.com.

Data can solve big societal problems — Technology, innovation and privacy can coexist.

Although there are no similar reforms in the United States, the Future of Privacy Forum (FPF), a non-profit advancing principled data practices in support of emerging technologies, “helps fill the void in the space not occupied by law which exists due to the speed of technology development.” Their vision, as self-defined “data optimists,” is that data can solve big societal problems, and that technology, innovation and privacy can coexist (fpf.org).

Seventy-two percent (72%) of American consumers believe businesses are best equipped to protect their data. Yet many, from 49% to 55%, worry that technologies like artificial intelligence (AI), the internet of things (IoT) and machine learning are risks to their privacy, according to the PwC Consumer Intelligence Series survey Protect.Me.

Tech is neutral. Can we maximize benefits and minimize risks?

The technology is neutral. “At the end of the day the question is how are we going to maximize the benefits and mitigate the risks,” said FPF’s Vice President of Policy John Verdi at the PwC US Privacy Retreat in Minneapolis last November.

Machine learning and AI have the potential to “bolster Privacy-Enhancing Technologies (PETs).”

“Entities can make better inferences” about a person’s privacy preferences than could humans analyzing what consumers do with information. This, however, “raises challenges about accountability” and “how those algorithms are making decisions that impact individuals.”

There is potential for emerging technologies to “push even further” in helping consumers control their privacy, explained Alessandro Acquisti, Professor of Information Technology and Public Policy at Carnegie Mellon University, who joined the discussion.

Consumers are inundated with endless privacy updates from many devices and online services, but providing them with “apps and tools that learn over time from their choices and preferences” will help them “navigate the endless privacy choices.” Consumers have spoken: the same PwC survey shows that 71% of consumers find companies’ privacy rules difficult to understand (PwC Consumer Intelligence Series: Protect.Me).

Privacy-Enhancing Technologies protect data but degrade its quality. Who will bear the cost?

Much work has been done on PETs in the last 20 years, “which can address privacy in almost every conceivable scenario. The technology is there.” Everything being done with data today can be done “while simultaneously protecting some data.” However, “removing data from a transaction degrades” its quality.

“The essential question” for “economists to address together with technologists” is “who will bear the cost” of the degradation. Consumers, who will receive less useful information? Companies, not able to fully monetize their data? Or society as a whole, which will “lose the ability to innovate as quickly,” concluded Professor Acquisti.

Concrete ways to protect yourself now.

Mr. Verdi noted that the “threat models that most individuals are facing have very specific vectors: mobile devices, email and financial accounts.” He shared concrete examples to use now that “will mitigate concerns about safeguarding data.”

Use a password manager to generate long passwords;

Use two-factor authentication;

Use the highest level of security on mobile devices.
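For readers who want to see what the first tip means in practice, here is a minimal sketch of generating a long, random password with Python's standard-library secrets module. The 24-character length and the character set are illustrative assumptions, not a recommendation from Mr. Verdi; a real password manager also handles secure storage and syncing.

```python
import secrets
import string

def generate_password(length: int = 24) -> str:
    """Return a cryptographically random password of the given length.

    Uses `secrets.choice`, which draws from the OS's secure random
    source, rather than the `random` module, which is predictable.
    The length and alphabet here are illustrative choices.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Each call produces a fresh, unpredictable password.
print(generate_password())
```

A dedicated password manager generates passwords like this for every account, so no two sites share a credential and none needs to be memorized.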

Elaine Sarduy is a freelance writer and content developer @Listing Debuts