Data Privacy in the Age of AI Means Moving Beyond Buzzwords
Privacy is possible, but only if companies move beyond empty promises and commit to ethical data practices.
As tech companies race to capitalize on the latest developments in artificial intelligence, people’s privacy concerns are being disregarded in the pursuit of new features and profit opportunities.
Many companies justify their actions with a false narrative that people do not truly care about privacy, but any perceived apathy is the product of generations of companies choosing not to invest in giving customers meaningful privacy choices. Privacy is not dead; if anything, it is more relevant than ever in the face of emerging AI tools built on people’s data. Companies need to acknowledge the importance of privacy and start investing accordingly.
In reality, it is companies themselves, not consumers, that often disregard privacy concerns. Look no further than the recent data breach at 23andMe for an example of a corporation blaming everyone but itself for its own mistakes.
In recent months, the company disclosed a data leak affecting roughly half its customers, approximately 7 million people. For many, the exposed data included genetic information, sensitive health information, and a list of their relatives. Instead of acknowledging its own privacy failures, the company has responded by blaming users for not updating their passwords and downplaying the breach by claiming the information “cannot be used for any harm.” The company is now facing a class-action lawsuit from users alleging negligence.
We do not have to live in a world of endless breaches and privacy violations. Companies can and should prioritize privacy to maintain trust with their customers, but this does not happen by accident. It requires an unequivocal commitment to privacy from both executives and builders, along with an ongoing investment of resources. It is not enough to say your company practices “privacy by design” without actually translating privacy into real company policy and practice. Privacy considerations must sit at the center of product decisions from the moment you decide to use people’s data, not be bolted on at the end as a half-hearted retrofit.
Building for privacy requires assessing whether a company’s existing privacy metrics indicate anything of relevance. Simply having roles with “privacy” in the title, for example, is not an effective measure of a privacy practice. In the same vein, headcount is not a privacy solution. Meta proudly claims it has 40,000 people working on its safety and security teams, yet that does not change the fact that, according to Consumer Reports, the average Facebook user has their data shared by over 2,000 different companies. Instead, companies should focus on metrics that evaluate data protection, customer trust, and the enforcement of tangible privacy measures throughout the entire organization.
ROI and privacy may appear to be at odds, but that is a false tradeoff. If you respect your customers, respect their data. This has to come from the top, which is a challenge for corporate leaders who are incentivized to focus on big, flashy innovation projects instead of mitigating privacy risks. We see this right now as companies rush to hire “chief AI officers” and deploy AI tools while the privacy implications of those tools remain an afterthought.
Leaders should care about privacy not just because it is ethical, but also because it is good for business. Privacy builds trust with your customers and increases their lifetime value to your organization. Polling from the Ethical Tech Project found privacy features increased consumer purchasing intent by more than 15% and increased trust by over 17%. Effective privacy measures also strengthen a company’s reputation, differentiate their product, and protect against ending up on the wrong side of an investigation by the Federal Trade Commission or a state attorney general.
Good privacy practices are attainable with a sustained, committed effort from corporate leadership and everyone who works with data. Thankfully, strategies exist to help business leaders. Two examples I am familiar with, among many, are The Ethical Tech Project’s privacy stack, a privacy reference architecture for technical teams, and the Center for Financial Inclusion’s privacy toolkit for inclusive financial products.
Privacy, or the lack of privacy, in modern technology products is a choice that every company faces. For the sake of their companies, corporate leaders can and should invest in offering their customers meaningful privacy options instead of empty promises.