Japan relaxes privacy laws to make AI development easy

[. . .] Japan’s government on Tuesday approved amendments to the nation’s Personal Information Protection Act that remove the requirement for opt-in consent before sharing personal data.

The amendments require those who acquire facial images to explain how they handle the data, but offering a chance to opt out won’t be mandatory.

Organizations that collect the wrong data, or maliciously use it to harm citizens, will face fines equivalent to the profit they make from improperly using data. Japan’s government will also implement fines for obtaining data through fraudulent means.

But in the event of a data leak, organizations will not need to notify impacted citizens if there is little risk of harm to individuals.

Despite its reputation as a hotbed of technology, Japan has been markedly slow to digitize government services. These amendments are aimed, in part, at making sure Japan is not slow to catch the AI wave.

Below is a press release concerning these amendments from the website of Japan's Personal Information Protection Commission (PPC), which contains some untranslated source materials.


Japanese PM probably struck some shadowy deal with the devil incarnate (aka Peter Thiel) when they met last month.

https://xcancel.com/mrjeffu/status/2029570467393044510

It’s awesome that every government is kowtowing to an industry that’s unprofitable, makes stuff that everyone hates, and drives up the prices of everything.

This decision was obvious from the start, back when the expert panel was meeting.

The medical industry had pushed hard for access to certain types of information, like medical histories.
So the expert panel said it was OK to process sensitive personal data without consent, but only to create statistical data, including AI models.
The Japanese government's position is that statistical information which doesn't identify specific individuals poses little risk to personal rights and is therefore fine.
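The distinction the panel draws can be illustrated with a small sketch (the records and field names here are hypothetical, just to show the idea): once individual-level rows are aggregated into group counts, no single person's data survives in the output.

```python
from collections import Counter

# Hypothetical individual-level records (made-up fields, for illustration only).
records = [
    {"age": 34, "gender": "F"},
    {"age": 29, "gender": "M"},
    {"age": 41, "gender": "F"},
    {"age": 35, "gender": "F"},
]

def age_band(age):
    # Bucket exact ages into decades so no individual age appears in the output.
    return f"{(age // 10) * 10}s"

# Aggregate into "statistical data": only group counts remain, no per-person rows.
stats = Counter((age_band(r["age"]), r["gender"]) for r in records)

print(dict(stats))  # → {('30s', 'F'): 2, ('20s', 'M'): 1, ('40s', 'F'): 1}
```

Whether such aggregates are truly low-risk is exactly what the experts below debate: the output names no one, but the collection step still processed everyone's raw data.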

Many Japanese experts tend to focus more on how statistical data is used than on the creation of that data itself.


Facial scans are also fair game. The amendments require those who acquire facial images to explain how they handle the data, but offering a chance to opt out won’t be mandatory.

This has been perfectly legal since long before this amendment was even suggested.
Big shops have been using AI to scan people’s faces, bodies and gaze to collect information about gender, age and interests.

For example, the convenience store chain FamilyMart has been doing this since 2021. This is their privacy policy (archive.md). The following is an excerpt from the relevant section, translated into English:

We collect information—including the age, gender, visibility of digital signage, and number of store visitors—of customers using stores operated by FamilyMart Co., Ltd. or its franchisees (hereinafter referred to as “Target Stores”), as inferred from full-body and facial images of customers (hereinafter referred to as “In-Store Camera Video Data”) captured by marketing-specific cameras installed at each Target Store. This information is referred to as “Estimated Results Data.”

Because information about how their data is used is often displayed only as small stickers at the shop entrance, most people never notice it.
This means their data is being collected for purposes like statistics, yet they are almost entirely unaware it is happening.