Japan's APPI amendment bill would open narrow lane for some AI uses, tighten rules elsewhere

A bill to amend Japan's APPI would codify a new concept of "statistical processing," while putting sharper boundaries around minors' data, facial feature data, opt-out sharing and enforcement.

Contributors:
Takashi Nakazaki
Partner
Anderson Mori & Tomotsune
Japan's Personal Information Protection Commission announced 7 April that the Cabinet of Japan approved a bill to amend the Act on the Protection of Personal Information. The PPC frames the package as accomplishing two things at once: facilitating data use, including artificial intelligence-related data use, while strengthening protection and enforcement where data handling creates greater risks for individuals.
The bill does move in both directions simultaneously. It is not a wholesale rewrite of Japan's privacy law, first enacted in 2003, but a targeted reform package, and a meaningful one.
Unlike the EU General Data Protection Regulation, which generally begins by asking what legal ground supports data processing, Japan's amendment bill does not try to import a new across-the-board lawful-basis model into the APPI. Instead, it works more surgically, by changing specific consent-sensitive parts of the APPI and building new rules around higher risk practices.
Under Japan's private-sector framework, the more important questions are usually about purpose limitation, use beyond the stated purpose and rules on third-party sharing and overseas transfers. So, for most privacy teams, the more useful question is not whether Japan is moving closer to Europe. It is which APPI rules would be loosened, and which would get tougher under the amendment bill.
A new statutory concept, not a free pass
The most significant change is the introduction of a new statutory concept, "statistical processing," defined as the creation of statistics and similar analytical acts that derive trend- or characteristic-level information from large volumes of information, excluding information about individuals, where the activity is low risk to individuals' rights and interests.
Just as important, this is not a case where the government is newly legalizing statistical use from scratch. The PPC's existing FAQ already says that because purpose specification applies to "personal information," statistical data that is no longer personal information falls outside that rule, and the act of converting personal information into statistical data does not itself need to be separately specified as a purpose of use.
What the draft bill does is take that interpretive position and give it explicit statutory footing through a newly defined legal concept.
The bill then uses that concept to create concrete no-consent pathways. Subject to conditions, businesses could acquire publicly available special care-required personal information without consent where it is needed for "statistical processing, etc." They could also provide personal data or personal-related information to a third party without consent where the recipient needs it for that purpose, required matters are publicly disclosed and the parties enter into a written agreement reflecting the special framework.
Under the APPI, special care-required personal information refers to sensitive data that could lead to discrimination if misused, including race, beliefs, criminal history, medical history and disabilities. It requires strict protections, including mandatory prior consent for collection.
The amendment outline also adds guardrails: public disclosures must continue for a prescribed period and the data cannot be used beyond the disclosed statistical purpose or re-shared except in limited cases.
This is where AI comes in. The PPC's explanatory materials expressly state this new category can include AI development where the activity can properly be characterized as "statistical processing." So this is a narrow lane for certain analytical and AI-related uses, but it is not a blank check.
That may sound technical, but this is exactly the kind of point that tends to divide legal, product and data teams. Companies will need to decide whether a live project really fits the new statutory box, or whether it only looks close enough at first glance.
Easing narrower, consent-heavy scenarios
The draft bill would also exempt businesses from the consent requirement for collecting special care-required personal information, for third-party provision, or for altering utilization purposes in a few narrower scenarios where it is clear, from the circumstances surrounding the collection of the personal information, that the processing is not contrary to the individual's will and therefore clearly does not harm the individual's rights and interests. This includes cases where the processing is clearly necessary and unavoidable for the performance of a contract with the individual. The PPC, which is expected to identify other qualifying cases by regulation, gives practical examples, including hotel-booking arrangements and overseas remittances.
It would also broaden current life, body, property and public-health exceptions beyond cases where obtaining consent is difficult, to cases where there is otherwise reasonable justification for not obtaining it.
The bill also clarifies that hospitals, clinics and other bodies whose purpose is the provision of medical care fall under "academic research institutions, etc." for purposes of the APPI's research exemption. Under the APPI, that term covers a university or other organization or group engaged in academic studies, or a person belonging to one. While it has traditionally meant universities and academic institutions, whether medical institutions were included has been unclear. This clarification is significant for real-world life sciences, health data and medical research projects.
The tone changes for children's data and facial feature data
Where the draft bill gets much tougher is with children's data and facial feature data.
On children, the PPC says if an individual is under age 16, legal representatives would become the relevant counterparties for consent and notice-related matters. Minors — or their parents or statutory representatives — can, as a general rule, request the suspension of use, deletion or suspension of third-party provision of their retained personal data without having to meet the requirements that apply to adults. For example, adults may be required to show that the data is no longer needed or that their rights or legitimate interests are at risk.
The bill would also establish a new overarching duty for businesses processing children's personal data to take necessary measures to prevent harm to their rights, prioritizing the child's best interests considering their age and developmental stage. Furthermore, parents and other statutory representatives must also prioritize the minor's best interests when exercising rights or giving consent on the minor's behalf.
This moves a topic that has often lived in guidance more firmly into the statute. For businesses running consumer-facing services in Japan, it is likely to push age-gating, parental involvement and product design back onto the legal and compliance agenda.
Facial feature data is another standout. It is the principal category of "specified biometric personal information" introduced in the bill, focusing in particular on data derived from an individual's face that can be used to identify them. The emphasis is not on an ordinary image, but on the extracted feature data that can be used for identification. The bill would require prior notice of specified matters, such as the fact that specified biometric personal information is being handled, the purpose of use, the specific physical features being converted and procedures for exercising rights. It would also prohibit third-party provision of that data and broaden requests to stop its use.
In practical terms, identification-oriented biometric uses would move into a more tightly regulated bucket under Japanese law.
Some of the biggest compliance effects may come from the less glamorous changes
Several of the more operational changes within the amendment bill may end up having the broadest day-to-day impact, for example, outsourced data processing. The bill would expressly prohibit a contractor from handling entrusted personal data beyond what is necessary to perform the outsourced work. At the same time, if the contractor is genuinely instruction-bound and operates within a contractually defined framework, the processor would be exempted from the vast majority of general obligations under Chapter 4, Sections 2 to 4.
Specifically, the entrusted data processor would no longer need to individually notify individuals of the purpose of use or respond to data subjects' rights directly — which should be handled by the entrusting party. However, certain fundamental duties always remain applicable to the processor, including security management measures and the obligation to report data breaches. In other words, the bill tries to clamp down on processors using customer data for their own purposes, while calibrating obligations for service providers that are just carrying out instructions.
Another is incident response. The PPC says direct notice to individuals could be relaxed where the risk to their rights and interests is low, with substitute measures available instead. At the same time, unlawful third-party provision would become a reportable event. The direction here is not simply "more reporting" or "less reporting." It is a more risk-sensitive system, but one with some new triggers.
The bill also creates a new category of "contact-enabling personal-related information," defined as personal information containing descriptors that can be used to contact or otherwise transmit information to a specific individual. Examples include phone numbers, addresses, emails and cookie IDs. The point is straightforward: even if data does not identify someone in the strict APPI sense, it may still enable phishing, fraud, invasive outreach or harmful linkage. The bill would prohibit improper use and improper acquisition of this category of data. It would also tighten the opt-out regime by requiring providers to verify the recipient's identity and purpose of use in advance.
Enforcement is where the bill really shows its teeth
The enforcement package is not an afterthought. The bill would enable the PPC to move faster by issuing an emergency order not only where serious harm has already occurred, but also where infringement is imminent. It would also let the PPC require notice to affected individuals or public disclosure where needed to protect individuals' rights and interests.
And if the target of an order does not comply, the PPC would have an express basis to request that certain service providers stop the relevant processing or stop providing the relevant service.
Proposed sanctions are just as notable. The bill would increase criminal penalties, extend punishment to disclosures made for harmful purposes and create new penalties for wrongful acquisition through fraud and similar conduct. It would also introduce a limited administrative surcharge regime aimed at serious, economically motivated violations. Notably, no surcharge may be imposed if the business exercised due care or if fewer than 1,000 individuals are affected.
This would be a major step up in Japan's privacy enforcement.
What next?
The first question for organizations is whether any AI or analytics project that falls within the scope of Japanese law can genuinely fit within the new statutory concept of "statistical processing, etc." The second is whether Japanese-facing services need a revised approach for users under age 16. The third is whether facial recognition, vendor contracting, incident response, opt-out data sharing and broker-style data use need another look.
The bill would take effect within two years of promulgation, probably in the spring of 2028. Many of the crucial operational details would be determined through upcoming PPC rules and related guidance. So, while we can see the big picture now, the heavy lifting of implementation is still down the road. The harder work will be in deciding what needs to change in existing products, vendor terms and data-sharing arrangements.
The views expressed in this article belong solely to the author.
