OPINION

Privacy by design: How to avoid ending up on the front page for the wrong reasons

Apurva Wadodkar and Jigisha Pardanani discuss the importance of treating privacy as a foundational part of an organization's architecture, an approach known as privacy by design.

Contributors:

Apurva Wadodkar

Head of data and AI

TI Automotive

Jigisha Pardanani

CIPP/US, CIPM, FIP

Privacy Officer

SMBC MANUBANK

Editor's note

The IAPP is policy neutral. We publish contributed opinion pieces to enable our members to hear a broad spectrum of views in our domains.

Most privacy disasters don't begin with bad intentions. They often start with a shrug and a familiar promise: "We'll fix it later." 

But later is when regulators call, customers lose trust or a company trends on social media for undesired reasons.

When it comes to privacy, procrastination isn't just poor planning; it's one of the most expensive forms of technical debt teams can take on. Unlike messy code, privacy debt compounds risk, widens the potential blast radius and can turn small shortcuts into existential problems.

Data product teams move fast, experimenting and optimizing for insights. When privacy requirements show up late, they feel like sudden speed bumps. The frustration is real and avoidable.

The solution is to treat privacy as a foundational part of an organization's architecture, the same as reliability, scalability or cost. This approach is known as privacy by design, and when done well, it doesn't slow innovation; it makes it sustainable.

Start with less data: yes, really

Engineers love data. Product teams love insights. Privacy professionals, however, love one question above all others: "Do we actually need this?"

This question needs to sit at the heart of every ingestion pipeline, schema review and feature proposal. Collecting unnecessary data isn't just wasteful; it's a liability waiting for the right trigger. Every extra attribute stored increases the attack surface, compliance burden and cleanup cost when something goes wrong.

If a use case works with age ranges, there's no reason to store full birthdates. If analytics can run on anonymized or pseudonymized identifiers, there's little justification for keeping real names attached. If trends are all that is needed, aggregate early and make the data permanently anonymous.
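The two minimization moves above, coarsening birthdates into age ranges and detaching real identities from analytics, can be sketched in a few lines. This is a minimal illustration, not a production anonymization scheme; the bracket width, salt handling and truncation length are all assumptions.

```python
import hashlib
from datetime import date

def age_range(birthdate: date, today: date) -> str:
    """Reduce a full birthdate to a coarse 10-year bracket before storage."""
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    lower = (age // 10) * 10
    return f"{lower}-{lower + 9}"

def pseudonymize(identifier: str, salt: str) -> str:
    """Replace a real identifier with a keyed hash so analytics can still
    join records without ever storing the original value."""
    return hashlib.sha256((salt + identifier).encode()).hexdigest()[:16]
```

If the use case genuinely only needs trends, the next step would be to aggregate these bracketed records early and discard the row-level data entirely.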

Too often, teams collect data "just in case." That hypothetical future use almost never materializes, but the risk absolutely does. Privacy by design isn't about saying no to data; it's about saying yes to the purpose.

Every data point should be able to answer three simple questions: What is it used for? Who benefits from it? And what would break if it were not collected?

If those answers are vague, defensive or buried in technical jargon, that's a warning sign in the design. Here's a practical litmus test: would this data use make sense to a customer if it were explained to them in plain language? If the explanation starts with "well, technically…," it's probably time to rethink the design.

Know, own and classify the crown jewels

What isn't understood cannot be protected. A common failure in enterprise environments is treating all data as equal risk. It isn't.

Some datasets are low impact if exposed. Others, such as personal data, financial information and regulated content, are an organization's crown jewels. When everything is lumped together, critical data winds up protected by the weakest controls.

Start with strong data classification. Not as paperwork, but as strategy. Categories like public, internal, confidential and regulated data create shared language across engineering, security and legal teams.

But classification alone isn't enough. Every meaningful dataset needs a clear owner.

Ownership isn't symbolic; it is accountability. Owners understand the data, approve its use, enforce protections and challenge questionable requests. When ownership is absent, responsibility evaporates.

Identifying an organization's crown jewels and assigning ownership can ensure protection that is proportionate, targeted and effective.
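The classification-plus-ownership idea amounts to a simple data catalog: every meaningful dataset carries a category and an accountable owner, and the crown jewels fall out of a query. The sketch below is illustrative; the category names match the article, but the dataset names and owner addresses are invented examples.

```python
from dataclasses import dataclass
from enum import Enum

class Classification(Enum):
    PUBLIC = "public"
    INTERNAL = "internal"
    CONFIDENTIAL = "confidential"
    REGULATED = "regulated"

@dataclass(frozen=True)
class DatasetEntry:
    name: str
    classification: Classification
    owner: str  # the accountable person who approves use and challenges requests

# Hypothetical catalog entries for illustration only.
catalog = [
    DatasetEntry("marketing_site_metrics", Classification.PUBLIC, "web-team@example.com"),
    DatasetEntry("customer_pii", Classification.REGULATED, "privacy-officer@example.com"),
]

def crown_jewels(entries):
    """Surface the datasets that warrant the strongest controls."""
    return [e for e in entries
            if e.classification in (Classification.CONFIDENTIAL, Classification.REGULATED)]
```

Even a catalog this small creates the shared language the article describes: engineering, security and legal can all point at the same entry when deciding how a dataset may be used.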

Lock the doors even on the inside

"Not everyone needs access to everything" is one of those statements everyone agrees with and then quietly ignores. Internal overexposure is one of the most common contributors to privacy incidents and it rarely involves malice. It's usually convenience.

When sensitive data is easy to access, it will eventually be accessed for the wrong reasons — curiosity, speed or simple misunderstanding. Privacy by design means applying least privilege by default, not as an afterthought.

Access should be role-based, intentional and reviewable. Production data should never casually flow into development or testing environments. Encryption should be standard, not special. Credentials and secrets should never live in plain text or shared documents.
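Least privilege by default can be expressed as a deny-by-default permission check: a role gets nothing unless a permission is explicitly granted to it. The roles and permission strings below are hypothetical examples, not a prescribed scheme.

```python
# Hypothetical role-to-permission mapping; names are illustrative only.
ROLE_PERMISSIONS = {
    "analyst": {"read:aggregated"},
    "engineer": {"read:pseudonymized", "write:pipeline"},
    "privacy_officer": {"read:raw", "read:access_log"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Deny by default: access is granted only if the role explicitly holds
    the permission. Unknown roles get an empty permission set."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```

The important property is the default: an unrecognized role or an unlisted permission fails closed, which is exactly the opposite of the "convenient overexposure" pattern described above.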

Equally important is visibility. Detailed access logging isn't about surveillance; it's about accountability. Knowing who accessed what, and when, is essential for audits, investigations and trust.

Third-party access deserves particular scrutiny. Vendors, partners and service providers often become extensions of an organization's data ecosystem. If sharing a company's personal data would be considered inappropriate in a given situation, then sharing customers' data under the same circumstances is equally inappropriate. Minimize what is shared, verify standards and revisit agreements regularly.

When access is intentional and controlled, privacy stops being a policy document and starts becoming a habit.

Decide when data dies

One of the most overlooked aspects of privacy is data life cycle management. Data is rarely meant to live forever, yet many systems behave as if storage were infinite and consequences nonexistent.

Every dataset should arrive with an expiration date already attached. Some data will naturally be short-lived. Other datasets, like financial records or audit logs, may need to be retained for years to meet legal or regulatory requirements. The key is that retention is deliberate, documented and enforced.

Privacy by design requires thinking about data deletion at the moment of ingestion, not years later during a panic. Automated retention policies, deletion workflows and cleanup of backups and downstream copies are essential. Manual processes don't scale and they inevitably fail.
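"Deletion decided at ingestion" can be made concrete by stamping every record with its expiry the moment it is created, then letting an automated sweep enforce it. This is a minimal sketch; the categories and retention periods are illustrative assumptions, since real values come from legal and regulatory review.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative retention periods; actual values must come from legal review.
RETENTION = {
    "session_logs": timedelta(days=30),
    "financial_records": timedelta(days=365 * 7),
}

@dataclass
class Record:
    category: str
    payload: dict
    expires_at: datetime

def ingest(category: str, payload: dict, now: datetime) -> Record:
    """Stamp every record with its deletion date at the moment it is created."""
    return Record(category, payload, now + RETENTION[category])

def sweep(records: list[Record], now: datetime) -> list[Record]:
    """Automated cleanup: keep only records that have not yet expired."""
    return [r for r in records if r.expires_at > now]
```

In a real system the sweep would also have to reach backups and downstream copies, which is precisely why the article insists the process be automated rather than manual.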

Keeping data "just in case" is rarely a source of value. It is, however, a reliable source of risk. Old data is often poorly understood, lightly protected and forgotten: exactly the combination attackers love. If no one can say why data still exists, that's a signal something is wrong with the design.

Plan for things going wrong — because they will

Even with the best design, incidents happen. Systems fail. People make mistakes. The difference between a contained issue and a full-blown crisis is preparation.

Privacy by design means assuming failure and designing for response. That includes having a clear, tested breach response plan long before it is needed. Roles and escalation paths should be explicit. Teams should know how to recognize early warning signs and how to report them without fear or friction.

Privacy checks shouldn't live exclusively in legal or security reviews. They belong inside product development and release cycles, where design decisions are made. Catching issues early is always cheaper: financially, operationally and reputationally.

Panic is expensive. Preparation is not.

The final test

Before shipping anything, ask one simple question: "If this design decision were public tomorrow, would we stand by it?" If the answer is yes, that's privacy by design. If not, it's the most valuable issue that could be fixed before it ever becomes a headline.

Privacy done right doesn't slow innovation. It keeps it sustainable.


This content is eligible for Continuing Professional Education credits. Please self-submit according to CPE policy guidelines.



Tags: Program management, Privacy
