Focus areas when implementing data protection by design and by default in 2026

Though data protection by design and by default is a well-established concept under the EU GDPR, its real-world application remains inconsistent. Four assessment factors can help with implementation.

Contributors:
Joanne Ro
Privacy Lawyer/DPO
Data protection by design and by default is one of the most widely accepted principles of the EU General Data Protection Regulation, yet nearly a decade after the regulation's adoption, its real-world implementation remains uneven and inconsistently evidenced.
Article 25(1) of the GDPR requires controllers to put in place appropriate technical and organizational measures designed to implement the data protection principles, integrate the necessary safeguards into processing, and protect data subject rights.
Four assessment factors — state of the art, cost of implementation, processing context and risks to individuals — can be considered collectively to determine whether the measures are appropriate and systems adequately embody data protection by design and by default.
In 2026, implementing data protection by design and by default is expected to require a distinct approach to each of these four factors due to technological developments and current regulatory practices.
The concept's growing importance is reinforced by new regulatory initiatives in jurisdictions around the world. Current trends lean toward detailed models of artificial intelligence regulation rather than general formulations. Such initiatives place new obligations on businesses that use AI systems: conducting structured risk assessments, documenting how AI systems operate and their intended purposes, tracking training data, and implementing control mechanisms when AI systems are involved in decisions affecting individuals' rights or interests.
Further development of the regulatory framework is expected to impose new obligations on organizations regarding the use of AI systems. It is in this context that implementing the data protection by design and by default principle will require a more comprehensive approach, taking into account the technical characteristics of systems and the nature of how AI technologies are developed and used.
Focus area 1: State of the art
In practice, state of the art is typically treated as a one-time assessment at the system's launch.
Yet the state of the art is understood as the level of technical and organizational development that is reasonably achievable and effective at a given point in time. It therefore requires regular review and updates to ensure maximum protection of personal data — but in practice, such assessments are rarely, if ever, revisited.
Article 25 of the GDPR obliges controllers to take the latest technological advances into account when determining the most effective technical measures. The state of the art is thus a dynamic, evaluative concept, and a controller's failure to continuously update safeguards in line with new threats and market developments may result in noncompliance with the GDPR.
However, technical measures become obsolete much faster than legislative requirements are updated. Solutions that were previously considered adequate may become ineffective due to the emergence of new privacy threats.
Another problem is the systematic underestimation of organizational measures: the focus often falls on technological solutions rather than on management practices, staff skills development, and control processes.
What 'state of the art' requires in 2026
The requirement to take into account state-of-the-art data protection by design and by default cannot focus exclusively on technological solutions.
A common misconception is that investing in the latest technologies will, by itself, be sufficient to comply with the requirements. If the organizational environment remains unchanged, however, the desired effectiveness is unrealistic.
Failure to regularly reassess the compliance of measures with the state-of-the-art principle may in itself indicate noncompliance with the concept. In this sense, state of the art is not a static level of protection but requires the controller's ongoing ability to adapt its technical and organizational measures to current trends.
Focus area 2: Cost of implementation
Article 25 of the GDPR explicitly allows controllers to consider the cost of implementation when choosing safeguards. However, the European Data Protection Board states that cost is an optimization factor, not a reason to lower the level of protection.
In practice, this element is often interpreted as permission to implement weaker measures for budgetary reasons. Quite often, organizations express dissatisfaction with expensive data protection measures but fail to seek cheaper yet equally effective alternatives.
As a result, the cost itself may be used as an argument for refusing to improve the data protection architecture.
What 'cost of implementation' requires in 2026
Accounting for the cost of implementation within the data protection by design and by default framework means taking a comprehensive approach to choosing measures that are both effective and proportionate.
First, controllers need to rethink the role of organizational measures. In many cases, these measures can noticeably reduce risks without significant financial costs. Implementing clear access rules, default functionality restrictions or regular staff training can substantially increase the level of protection.
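The "clear access rules" and "default functionality restrictions" mentioned above can be cheap to implement precisely because they are organizational choices encoded once in software. As an illustrative sketch only — the roles and fields below are hypothetical, not drawn from any real system — a default-deny access rule can be expressed in a few lines:

```python
# Illustrative sketch: default-deny field-level access with an explicit allowlist.
# Roles and field names are hypothetical examples.

ROLE_FIELD_ALLOWLIST = {
    "support_agent": {"name", "email"},
    "billing": {"name", "invoice_address"},
}

def visible_fields(role: str, record: dict) -> dict:
    """Return only the fields this role is explicitly allowed to see.

    Any role absent from the allowlist sees nothing (default deny),
    mirroring the 'restricted access by default' idea in the text.
    """
    allowed = ROLE_FIELD_ALLOWLIST.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

record = {"name": "Ada", "email": "ada@example.com",
          "invoice_address": "1 Main St", "ssn": "..."}
print(visible_fields("support_agent", record))  # only name and email
print(visible_fields("marketing", record))      # no explicit grant, no access
```

The design choice here is that access must be granted affirmatively; forgetting to configure a role fails closed rather than open.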
Second, considering costs involves actively searching for and comparing currently available alternatives. Technological developments have led to the emergence of a variety of tools that are more flexible and automated than traditional architectural approaches.
Focus area 3: Nature, scope, context and purposes of processing
Beyond financial considerations, effective design requires an understanding of the processing environment itself.
Under Article 25 of the GDPR, technical and organizational measures should be determined considering the nature, scope, context and purposes of the processing. These factors are not mere formalities; they should directly influence how the data processing system is designed.
The nature, scope, context and purposes of processing are usually set out in detail in the data protection impact assessment or internal documentation but rarely become a factor that actually shapes the architecture of privacy protection.
In addition, processing purposes are often formulated in the most general terms possible to preserve future flexibility. However, this approach contradicts the principle of purpose limitation and undermines the idea of data protection by design and by default.
What this factor demands in 2026
The implementation of data protection by design and by default should begin not with the selection of technology, but with an understanding of why the data is needed. Processing objectives must be defined before developing IT systems, as they determine what data may be collected, who may access it, retention periods and permissible system functionality.
Once the objectives have been defined, the logical next step is to understand the actual scale of processing. Today, the risks are not only related to the number of individuals, but also to the fact that data can be combined between different systems, reused for new tasks and transformed into detailed profiles.
Once processing scope is defined, organizations must recognize that legal and technical possibilities may diverge from customer expectations. Relationship context, user trust and vulnerability of different user groups — for example, children — should shape defaults and transparency.
Focus area 4: Risks to rights and freedoms of natural persons
The requirements of data protection by design and by default place risk assessment at the center of safeguarding individuals' rights and freedoms. Yet organizations often struggle to assess those risks meaningfully, and many controllers treat risk assessment as a formality within mandatory compliance procedures.
As a rule, companies prepare reports with conclusions that describe the nature of risks. But even this rarely influences the choice of technical solutions or the configuration of business processes.
The result is that risk assessment is perceived as a bureaucratic step rather than a working tool, when in fact a proper assessment is a critical process for maintaining privacy.
In practice, more mature organizations do not view risk assessment as a formal requirement that is performed solely for compliance purposes. It is usually integrated into product and system development processes. In other words, risk analysis is not carried out after development is complete, but at various stages, namely during the design of a solution, the involvement of suppliers, or the launch of new features. This makes it possible to take into account potential risks to individuals' rights and freedoms before key technical decisions are made.
It is also important that general conclusions about the existence of risks are translated into specific measures in practice. For example, if the use of automated decision-making may create risks of discrimination or unfair outcomes, one way to mitigate them may be to introduce additional human oversight or procedures for reviewing such decisions.
In addition, the results of risk assessments are usually linked to internal information security and data management procedures. In this case, the risk assessment ceases to be just a separate document and actually influences the choice of technical and organizational solutions.
What a risk-based approach requires in 2026
A risk-based approach means working proactively. Risk assessment should produce specific, actionable findings, not general statements.
Risks increase when processing involves data of children or other vulnerable groups. In such cases, it is not only about data volume or sensitivity, but about the individual's ability to understand how their information is used and how to protect their rights.
The risks are further aggravated when processing involves automated decision-making and artificial intelligence systems, which are often opaque to users, rely on large datasets, and may reproduce or amplify errors and biases.
Since it is not enough to simply decide on protective measures, organizations must ensure they actually work. This involves checking whether the identified risks have been reduced after the implementation of technical or organizational solutions.
'By design' vs. 'By default'
The four focus areas discussed above define the parameters and features that must be taken into account by controllers. The main requirement of Article 25 of the GDPR is the application of these four factors prospectively, shaping design decisions before deployment and implementing defaults that are coherent with the design choices made.
By design and by default are intended to reinforce each other. While design embeds the capability for data minimization, default settings operationalize that capability without requiring affirmative action from the data subject. In practice, a common mistake is to think of data protection by design and data protection by default as a single requirement.
In fact, by design and by default operate at different stages of the system life cycle but have a common goal — to limit data processing before risks arise.
The data protection by design obligation arises at the stage of conceptualizing the technical task or project plan. This means data protection requirements should influence not the final settings, but the very concept on which the system is built.
In contrast, data protection by default refers to how the system actually works after launch. In this context, the standard settings determine the system's behavior without any additional action on the part of the user.
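The distinction can be made concrete with a minimal sketch. The field names below are hypothetical illustrations, but the pattern is the point: the most protective option is the default value, so a data subject who takes no action at all still gets data minimization.

```python
# Illustrative sketch: privacy-protective defaults in a settings object.
# Field names and values are hypothetical; the pattern is that the most
# protective choice applies unless the user affirmatively changes it.

from dataclasses import dataclass

@dataclass
class PrivacySettings:
    profile_public: bool = False     # profiles are private by default
    analytics_opt_in: bool = False   # no analytics without an opt-in
    retention_days: int = 30         # shortest retention tier by default

# A new user who never opens the settings screen gets these values:
settings = PrivacySettings()
```

Here "by design" is the decision that such a settings object exists and gates the relevant features at all; "by default" is the choice of initial values it ships with.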
Key takeaways
Given continuing technological development and the tightening of both general and sector-specific regulatory requirements, implementing the data protection by design and by default concept will become even more important for organizations.
This does not mean formal compliance with GDPR requirements but creating practical risk management mechanisms for personal data protection. Organizations that integrate these principles into their systems will, beyond the obvious practical benefits, be better placed to earn users' trust in their services.
Looking ahead, data protection by design and by default is likely to gradually become a key element of the sustainable development of digital systems.
To implement data protection by design and by default compliance, consider:
- Embedding data-protection requirements into system architecture and vendor selection from the earliest design stage.
- Ensuring default settings enforce data minimization, restricted access, and proportionate retention without user action.
- Regularly reassessing technical and organizational measures against evolving risks, standards, and technologies.
- Performing risk and repurposing assessments before incidents or complaints arise.

This content is eligible for Continuing Professional Education credits. Please self-submit according to CPE policy guidelines.

