Data Privacy in the Age of AI Means Moving Beyond Buzzwords

Privacy is possible, but only if companies move beyond empty promises and commit to ethical data practices.

Maritza Johnson, Principal at Good Research and Board Member of the Ethical Tech Project

May 7, 2024


As tech companies fixate on taking advantage of the latest developments in artificial intelligence, people’s privacy concerns are being disregarded in the pursuit of new features and profit opportunities.  

Many companies justify their actions with the false narrative that people do not truly care about privacy, but any perceived apathy is the product of generations of companies choosing not to invest in giving customers meaningful privacy choices. Privacy is not dead; if anything, it is more relevant than ever in the face of emerging AI tools built on people's data. Companies need to acknowledge the importance of privacy and start investing accordingly.

In reality, it is often companies themselves, not consumers, that disregard privacy concerns. Look no further than the recent data breach at 23andMe for an example of a corporation blaming everyone but itself for its own mistakes.

In recent months, the company disclosed a data leak affecting roughly half its customers, approximately 7 million people. For many, the exposed data included genetic information, sensitive health information, and a list of their relatives. Instead of acknowledging its own privacy failures, the company responded by blaming users for not updating their passwords and downplaying the breach by claiming the information "cannot be used for any harm." Users are now suing the company in a class-action lawsuit for negligence.


We do not have to live in a world of endless breaches and privacy violations. Companies can and should prioritize privacy to maintain trust with their customers, but this does not happen by accident. It requires an unequivocal commitment to privacy from both executives and builders, along with an ongoing investment of resources. It is not enough to say your company practices "privacy by design" without translating privacy into real company policy and practices. Privacy considerations must sit at the center of product decisions from the moment you decide to use people's data, not be bolted on at the end as a half-hearted retrofit.

Building for privacy requires assessing whether a company's existing privacy metrics indicate anything of relevance. Simply having roles with "privacy" in the title, for example, is not an effective measure of a privacy practice. In the same vein, headcount is not a privacy solution. Meta proudly claims it has 40,000 people working on its safety and security teams, but that does not change the fact that, according to Consumer Reports, the average Facebook user has their data shared by more than 2,000 different companies. Instead, companies should focus on metrics that evaluate data protection, customer trust, and the enforcement of tangible privacy measures throughout the entire organization.


ROI and privacy may appear to be at odds, but that is a false dichotomy. If you respect your customers, respect their data. This commitment has to come from the top, which is a challenge for corporate leaders who are incentivized to focus on big, sexy innovation projects instead of mitigating privacy risks. We see this right now as companies rush to hire "chief AI officers" and deploy AI tools while the privacy implications of those tools remain an afterthought.

Leaders should care about privacy not just because it is ethical, but also because it is good for business. Privacy builds trust with your customers and increases their lifetime value to your organization. Polling from the Ethical Tech Project found privacy features increased consumer purchasing intent by more than 15% and increased trust by over 17%. Effective privacy measures also strengthen a company’s reputation, differentiate their product, and protect against ending up on the wrong side of an investigation by the Federal Trade Commission or a state attorney general.  


Good privacy practices are possible, and they are attainable with a sustained, committed effort from corporate leadership and everyone who works with data. Thankfully, strategies exist to help business leaders. Two examples I am familiar with, among many, are The Ethical Tech Project's privacy stack, a privacy reference architecture for technical teams, and the Center for Financial Inclusion's privacy toolkit for inclusive financial products.

Privacy, or the lack of privacy, in modern technology products is a choice that every company faces. For the sake of their companies, corporate leaders can and should invest in offering their customers meaningful privacy options instead of empty promises.  

About the Author(s)

Maritza Johnson

Principal at Good Research and Board Member of the Ethical Tech Project

Maritza Johnson, Ph.D., formerly with Facebook and Google, is a Principal at Good Research, a Board Member at the Ethical Tech Project, and was the Founding Director of the Center for Digital Civil Society at the University of San Diego.
