Advocating for Privacy and Ethical Decision-Making at Work

The Business Value of Privacy

📚 Reading Assignment: Privacy Is a Business Opportunity - David Hoffman, Harvard Business Review

"...privacy protection should be a practice as fundamental to the business as customer service. Privacy is an essential element of being a good business partner...any business will benefit from proactively tackling privacy issues in one of three primary ways: protecting your brand, offering a competitive advantage from integrating privacy and security features into products and services, and creating new products and services designed to protect personal data.

There have been many breakdowns in security or privacy causing significant, and even irreparable, damage to companies’ reputations. For context, Young & Rubicam (Y&R) estimates that brand value represents nearly one-third of the $12 trillion in market capitalization of the S&P 500."


There is a range of arguments you can use to try to convince your organization's leadership of the business value of privacy. Know your audience: each person will be more receptive to some of these perspectives than others. Importantly, never present privacy solely as a compliance risk. In your author's experience, leaders hate thinking about compliance and - with weak enforcement regimes in many countries that have data protection legislation - they will try to dodge it, betting that they will never get caught. Instead, you could tell them:

  • "Privacy builds customer trust, giving us a competitive advantage over other companies who may suffer irreparable damage to their reputations due to breaches."
  • "Privacy enables us to offer new features and services to our customers, providing great return on investment."
  • "Deploying Privacy Enhancing Technologies will enable to us to unlock new data use cases that provide value both to us and our customers. We can offer them new insights from data and even unlock analytics partnerships with other organizations."


Environmental, Social, & Governance (ESG)

📚 Reading Assignment: What Do ESG and Ethics Have In Common? - Ann Skeet, Markkula Center for Applied Ethics at Santa Clara University

"...the courage of people serving currently in board and management roles has led them “to set goals to be realized over the long term, to serve interests beyond just those of financial investors, and to actively design the larger business system they operate within to promote ethical behavior, considering a broader set of stakeholders.

By co-creating standards with competitors, regulators, and lawmakers, business executives are showing that ethics is a vital part of business purpose. In very tangible ways, today’s executives are putting into practice the idea business exists to optimize collective value...We know it most familiarly as the common good."


Does your company already have an ESG strategy? If not - now is the perfect time to establish one! If one already exists, advocate for incorporating data ethics and human rights protections into your strategy. This might include:

  • Creating a company Code of Ethics (one you actually live and discuss throughout the company, not a token PR statement!) and dedicated Data Ethics and AI Ethics Guidelines.
  • Endorsing - and following! - the United Nations Guiding Principles on Business and Human Rights.
  • Running privacy and ethics training. You might already have some basic 'compliance ethics' training in place - which nobody pays attention to - but are you actively debating ethical dilemmas and relating them to your product? Do you and your colleagues know what your stances are on moral relativism? Great starting points would be to introduce your colleagues to: this course itself; to the Foundations of Humane Technology course; to the Trust & Safety Teaching Consortium's course materials; to Ethics Unwrapped (including the Giving Voice to Values framework); and to the Seven Pillars Institute's Ethics 101.


To Drive Cultural Change, Get People Talking

Continuing the theme of engaging your colleagues on privacy and ethics - why not kick off a lunchtime discussion series exploring case studies of ethical dilemmas in tech? Did a competitor just make a controversial product change? Let's talk about it. What are the pros and cons of tech companies withdrawing from Russia in response to the war in Ukraine? What should your company do in the case of a future conflict elsewhere? The topic of generative AI - and more broadly, which capabilities we're comfortable building into our products and which we're not - also provides ample scope for debate, as do the dilemmas of user content moderation and ongoing attempts to ban end-to-end encryption. You might find this template for technology ethics case studies a helpful starting point for discussing cases such as these.



Threat Modeling Is For Everyone

Threat modeling shouldn't be conducted solely by your security and privacy specialists. Diversity of perspectives is essential to designing more privacy-preserving, inclusive products, so get everyone on board! Making sessions as interactive as possible and gamifying them can help to get people engaged. You could try:

  • Running threat modeling workshops applying LINDDUN and your privacy toolkit (contextual integrity, the privacy-utility tradeoff, privacy principles, etc.) to upcoming product features or ideas. Print out the LINDDUN Go card deck to make this more interactive.
  • Privacy threat modeling frameworks are still evolving. They each have different focuses, and it will likely take some experimentation to discover which framework is the best fit for your own reasoning style and for your company's products. Other frameworks to check out include the NIST Privacy Framework and MITRE's PANOPTIC (Pattern and Action Nomenclature Of Privacy Threats In Context).
  • Alternatively, there are more general frameworks for harms modeling, where privacy threats are just one category of potential threats. The Open Data Institute provides the Data Ethics Canvas, and Microsoft has developed a game for ethical product development, Judgment Call, which uses their in-house harms modeling framework.


Remember That Change Won't Happen Overnight

It can be a struggle to keep going when your organization's values aren't aligned with your own. Remember that you're not alone in this - there are many people advocating for ethical technology, often failing on the first, second, or even tenth try, but they persist. As frustrating as it is to not be able to immediately put a stop to harms, cultural change is a marathon, not a sprint. You need to give your managers and colleagues time to develop their ethical awareness and come to their own conclusions. You might find joining communities of like-minded people, such as All Tech is Human, helpful for advice and support. Keep your sense of humor and try to be patient. But equally, if you've given your organization second chances over and over and the situation feels abusive - leave. Don't let them burn you out. They don't deserve you, and you will be able to do so much good elsewhere.

"Do not be daunted by the enormity of the world’s grief. Do justly, now. Love mercy, now. Walk humbly, now. You are not obligated to complete the work, but neither are you free to abandon it." - Rabbi Tarfon, Pirkei Avot



Further Reading