The images were gathered to facilitate the development of a private AI system whose implementation could allegedly speed up and improve the current identification methods at Incheon International Airport.
The South Korean government was involved in a large-scale, and illegal, distribution of personal data. Close to 170 million images taken at the country's largest airport ended up in the hands of a private AI development company. The cameras captured the faces of domestic and international travelers from many angles and with different expressions and, worst of all, without any of the subjects' consent.
According to Park Joo-min, a Democratic Party lawmaker who received the biometric data records from the Ministry of Justice (MOJ) and the Ministry of Science and ICT (MSIT), the project could enhance immigration review processes. However, he agrees that the methods used to achieve this goal need to be reexamined.
Harvesting facial biometric data remains a contentious topic, given the sensitive personal nature of the information and how easily it can be abused. In addition, collection channels are difficult to monitor. This data handling breach has reiterated existing concerns about AI and prompted new discussion among privacy advocates.
Surveillance cameras at every corner
According to the Immigration Act, the MOJ may not collect or store biometric data obtained during immigration – including facial photographs of Korean travelers. Despite this, fingerprints and facial images of travelers who use the automated immigration clearance system have been stored routinely since 2008 for identification purposes. The MOJ claimed it was this electronic information that it transferred to the MSIT, which then passed it on to the AI project's developers.
These claims could have carried some weight had it not been for the 93 cameras the MOJ installed in the immigration area in 2020 alone. These comprised:
- 55 fixed facial recognition cameras
- 26 omnidirectional cameras (with lenses on all four sides)
- 12 rotating cameras
In June this year, the National IT Industry Promotion Agency (NIPA), with the support of the MSIT, drafted a new surveillance plan. The plan proposed adding another 100 cameras this year, aiming for a total of 400 by the end of 2022. Ostensibly, the new recognition system will detect unusual facial expressions and postures, as well as any suspicious behavior.
In contrast to the current one-to-one matching system, which compares the person in front of the immigration camera to the photo in their passport, the new program could perform one-to-many matching, comparing a single face against many stored images in search of irregularities.
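The distinction between the two matching modes can be sketched in a few lines of Python. This is a minimal illustration only: the toy embeddings, the similarity threshold, and the cosine-similarity metric are assumptions for the sake of the example, not details of the actual Incheon system.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face embeddings (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def verify(probe, reference, threshold=0.8):
    """One-to-one: does the live capture match this one passport photo?"""
    return cosine_similarity(probe, reference) >= threshold

def identify(probe, gallery, threshold=0.8):
    """One-to-many: search a whole gallery of stored embeddings for the
    best match above the threshold; returns None if nobody matches."""
    best_id, best_score = None, threshold
    for person_id, reference in gallery.items():
        score = cosine_similarity(probe, reference)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy 3-dimensional embeddings; real systems use high-dimensional
# vectors produced by a neural network from the camera image.
passport = [0.9, 0.1, 0.3]
live_capture = [0.88, 0.12, 0.31]
stranger = [0.1, 0.9, 0.2]

print(verify(live_capture, passport))                 # one-to-one check
print(identify(live_capture, {"A": passport, "B": stranger}))  # one-to-many search
```

The privacy difference follows directly from the code: verification only ever touches the one reference the traveler presents, while identification requires the operator to retain a gallery of everyone's embeddings and scan it for each probe.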
The NIPA's proposal goes as far as involving "location tracking and photography of the faces of people arriving and departing" in their project.
Digital rights of millions severely violated
Of the 120 million images of foreign travelers, 100 million became the input for "AI learning" and the remaining 20 million were used for "algorithm testing". Domestic travelers were also involved in the project, contributing over 57 million facial images to the creation of an AI identification and tracking system.
South Korea's Personal Information Protection Act characterizes any information that can help identify a specific person through their physical, physiological, or behavioral traits as "sensitive information". Such information can be shared with and processed by a third party only with the subject's explicit consent.
The MOJ admitted that "the consent of the subjects of the information was not obtained". However, it also stated that, even though the facial images ended up in the hands of private AI developers, the data transfer was part of the same immigration clearance improvement project.
South Korea's Personal Information Protection Act allows data usage without explicit consent "within the scope reasonably related to the initial purpose of the collection". Many civic groups and activists are rightly concerned that, in their attempt to create a more advanced traveler identification system, the MOJ and MSIT have infringed upon the Act.
The Ministry of Justice's promise
South Korea's Minister of Justice, Park Beom-kye, declared that he had not been aware of the now-infamous data distribution but, in light of recent events, will do everything in his power to "run the project within its minimum scope to prevent abuse in the use of personal information".
When lawmaker Park Joo-min asked whether he had been aware of the ministry's decision to share travelers' facial data with the AI company, the minister promised he would "carefully and thoroughly look into concerns that [the data] might be misused or abused".
Major South Korean privacy advocates, such as MINBYUN and Jinbonet, argued that the Personal Information Protection Act has already been clearly violated, and they are planning to file a lawsuit on behalf of both Korean and foreign victims of the project.
The director of the Institute for Digital Rights, Chang Yeo-Kyung, called this a "shocking event" and a "precedent", as no examples of a similar violation exist in the digital history of immigration. The Minister of Justice, however, stated that:
The project itself cannot be retracted or canceled because it was initiated through a memorandum of understanding with the Ministry of Science and ICT.
Reassessing the data collection strategies
Even though the government announced the AI-assisted project in mid-2019, no details were disclosed at the time about its structure, execution strategy, or data collection methods – and none have been revealed since.
This gave the MOJ and NIPA enough time and space to operate within the gray areas of South Korea's Personal Information Protection Act. By claiming that the AI identification system was not developed as part of a separate project, the government has once again given itself ample room to reassess its data collection and transfer strategies.
It is still unknown how the incident in South Korea will be resolved, but this data violation has made one thing clear: countless AI projects around the globe are making it harder and harder to protect our basic right to digital privacy.