
Privacy in 2021 - A year in review

For this year's Privacy In Review, we're taking a look back at some of the most important privacy-related news from the past twelve months. We've identified one story from each month of the year that you should be aware of if you want to keep up to date with 2021's privacy issues.

Without further ado, let's get into it!

 

January

Police are using your car's data to prosecute you

We kick off January with an important question: How much data does your car store about you? The answer is much more than you think. Michigan police have found success in cracking cold cases by turning to the infotainment and telematics logs in suspects' cars for evidence. Increasingly, law enforcement agencies are using the treasure trove of user analytics these systems provide to piece together cases against defendants – because when it comes to data protection, cars are low hanging fruit.

Smartphones already capture vast amounts of telemetric and user data, but several factors make cars more attractive to LEOs. Firstly, cars offer many new metrics to analyze. Obviously, your car will keep a GPS log of locations and dates, but other service metrics are captured internally, such as when the doors were opened, how weight was distributed inside the cabin, and when and where the lights were switched on. Police can use these data points as the basis of a case.

Secondly, ease of acquisition. Most cars still have relatively lackluster internal security compared to personal computing devices, and often all that is needed to begin collecting data is physical access to the car. If a smartphone has been connected to the car, law enforcement can also access call logs, voicemails, and other metadata exchanged between the two devices.

Although a warrant is necessary to begin this process, the sheer amount of personal data processed through your car should make you take heed of your manufacturer's software and hardware policies.

February

Minneapolis police deployed a geo-fencing warrant demanding data held by Google on devices belonging to protesters

For February, we take a different view of the intersection between privacy and law enforcement with news that Minneapolis police successfully compelled Google to release the account data of any user who could be placed at the scene of a riot in south Minneapolis on May 27. After being served a "geofence" search warrant, Google handed over the records of anyone who had location history turned on inside the area marked by the warrant between 5:20pm and 5:40pm.
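To illustrate what a geofence warrant effectively asks a provider to do, here is a minimal conceptual sketch in Swift: filter stored location history down to the accounts whose records fall inside a bounding box during the specified time window. The types and field names are hypothetical illustrations, not Google's actual data structures.

```swift
import Foundation

// A purely conceptual sketch (not Google's actual system) of the filter a
// geofence warrant effectively describes: keep only the location records that
// fall inside a bounding box during a given time window.
struct LocationRecord {
    let accountID: String
    let latitude: Double
    let longitude: Double
    let timestamp: Date
}

struct GeofenceRequest {
    let minLat: Double, maxLat: Double
    let minLon: Double, maxLon: Double
    let start: Date, end: Date

    // Returns the set of accounts whose location history places them
    // inside the box during the window.
    func matchingAccounts(in records: [LocationRecord]) -> Set<String> {
        let hits = records.filter { r in
            r.latitude >= minLat && r.latitude <= maxLat &&
            r.longitude >= minLon && r.longitude <= maxLon &&
            r.timestamp >= start && r.timestamp <= end
        }
        return Set(hits.map { $0.accountID })
    }
}
```

In practice the provider runs a query of this shape over its own stores and returns the matching accounts to investigators, which is why anyone who merely passed through the area with location history enabled can be swept up.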

The indiscriminate use of geo-fencing to justify surveillance on innocent bystanders is not new, but without concrete numbers and transparency from either the officials involved or Google themselves, it is hard to quantify exactly how often this takes place.

Later in the year, Google would be caught in the news again, this time for providing investigators in Wisconsin with the results of a "keyword" search warrant, which turned over the account information of anyone who had made a specific search query between two dates. In this case, the query was the name of a missing minor whose disappearance police were investigating. As this only came to light after an accidental temporary unsealing of the warrant, Google has been reticent to discuss the case and did not reveal how many users had their information handed over as a result.

March

China wants 100% coverage for its Sharp Eyes program soon

For March, we head to the other side of the planet. China has one of the most highly surveilled societies in the world. Often, this evokes images of CCTV on every street corner and CCP enforcers ready to stamp out unwanted behavior at a moment's notice, but the reality involves much more decentralization and community consent. The Sharp Eyes program has been running since 2015 and consists of a series of measures taken to improve mutual surveillance at the local level.

Sharp Eyes is designed to enhance Chinese surveillance capabilities by installing CCTV systems across China, especially in isolated rural towns and villages. Although police and facial recognition algorithms process this data, local citizens are also able to view the CCTV feed in their area and report suspicious activity to the authorities. This is part of a much broader system of interconnected surveillance filters. China's metropolis management systems integrate traffic management, personal data, and public security data to create a data-driven response to whatever conduct the Chinese authorities wish to encourage (or discourage).

What is unique about the Sharp Eyes program is its reliance on communal consent to use and enforce the mechanics of the system: by and large, Sharp Eyes would not be as useful if the rural populace were not enthusiastic about using it.

 

April

Tim Cook and Mark Zuckerberg differ over third-party app data use

April brings us back to the eve of Apple's rollout of its App Tracking Transparency framework for iOS. Just a week before deployment, The New York Times broke a story detailing a tense meeting between Apple CEO Tim Cook and Facebook founder Mark Zuckerberg that took place in 2019, shortly after the Cambridge Analytica scandal.

During this meeting, Cook pressured Zuckerberg to scale back third-party data collection and delete any data collected outside of the core Facebook ecosystem, suggesting that it was Facebook's overzealous data collection policies that had sparked the Cambridge Analytica scandal in the first place. Zuckerberg, somewhat understandably given Facebook's core model of selling your data, declined.

Cook's pro-user stance in conversation with Zuckerberg reflects a wider effort at Apple to protect user privacy. Apple's App Tracking Transparency changes mean that any app installed on iOS needs explicit user permission to access the iPhone's unique ad tracking ID, giving users a greater degree of freedom to choose which (if any) third parties are trustworthy enough to handle their data.
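To make the mechanism concrete, here is a minimal Swift sketch of the flow an iOS app goes through under App Tracking Transparency. The ATTrackingManager and ASIdentifierManager calls are Apple's real APIs; the wrapping function is purely illustrative.

```swift
import AppTrackingTransparency
import AdSupport

// A minimal sketch of the App Tracking Transparency flow: the app must ask the
// user before it can read the device's advertising identifier (IDFA).
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Only now is the ad-tracking ID meaningful to the app.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking allowed, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            print("Tracking not permitted; the IDFA reads as all zeroes.")
        @unknown default:
            print("Unknown tracking status.")
        }
    }
}
```

Until the user explicitly allows tracking, the advertising identifier simply returns a string of zeroes, which is what deprives third-party trackers of their cross-app ID.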

May

Massachusetts passes state law tightening police requirements to use facial recognition

May brings somewhat heartening news: as personal data processing becomes a more integral part of police work, the laws surrounding its usage are beginning to mature. Massachusetts passed a state law requiring a court order before investigators can access facial recognition databases held by the FBI, the Registry of Motor Vehicles, or the Massachusetts State Police.

Previously, police in Massachusetts could simply email the Registry of Motor Vehicles a picture of a suspect and trigger a state-wide cross-reference against facial recognition databases without any extra judicial process. With such lack of oversight, it's easy to imagine the abuse of power a system like this could enable.

Additionally, Massachusetts law enforcement will be required to document their searches and publish relevant statistics to the public.

Although not quite the "warrant only" requirement pushed by lawmakers and the American Civil Liberties Union, this legislation marks an important step on the road towards sensible privacy policies in law enforcement.

June

UK chief data protection officer warns against reckless use of facial recognition

In June, the UK's data protection regulator, the Information Commissioner's Office (ICO), published an opinion outlining the issues with live facial recognition as a practice. Throughout the opinion, Information Commissioner Elizabeth Denham points out that many of the systems proposed to the ICO that integrate facial recognition technology do so in a way that is deeply antithetical to privacy and thus violate data protection law.

Many of the proposed uses for facial recognition involve personalized marketing in public spaces, and thus involve processing huge amounts of deeply personal biometric data without consent. The speed and scale at which this data is collected without transparency is deeply troubling to the ICO. Furthermore, the same issues of biased recognition and false positives/negatives that have plagued law enforcement's use of facial recognition crop up here, too.

Although the UK is still broadly compliant with GDPR thanks to a data adequacy agreement with the EU that enables each entity to process customer data from the other, we may yet see the UK shift away from the EU's pro-privacy stance. In the meantime, the ICO's opinions are welcome, if a worrying indicator of the current state of biometric data processing.

July

Spyware sold to law enforcement agencies used by foreign governments to spy on journalists

In July, a consortium of journalist groups led by Amnesty International and Forbidden Stories, a French media non-profit, delivered a story unlike any other to the public: the Pegasus Project. The result of 17 different media organizations working under a single umbrella to cover a single story, the Pegasus Project details the abuse of a phone spyware kit developed by the Israeli cyber-arms company NSO Group to spy on hundreds of journalists, lawyers, diplomats, and heads of state.

The spyware in question, Pegasus, could be remotely installed without user interaction and completely compromise the targeted phone, not only stealing all the data on the device but also turning it into a constantly eavesdropping spy. Pegasus was sold by NSO Group to foreign law enforcement agencies on the understanding it would be used for targeted surveillance of wanted criminals, but in practice it has become apparent that the software has been used as a tool by repressive nations to spy on their citizens. Forensic work by experts associated with Forbidden Stories confirmed that Pegasus had targeted many opposition figures and journalists in India, Saudi Arabia, and the UAE, including Jamal Khashoggi.

August

Apple announces iPhone gallery scanning updates

In August, Apple made a wild U-turn on all the good noise they had been making about user privacy by announcing a forthcoming iOS update that would scan your phone's internal photo library. Many cloud hosting services, including Google, Facebook, and even iCloud itself, already scan uploads for child sexual abuse material. However, Apple's plans represent an intrusion into the privacy domain of your phone itself, blurring the line between iCloud and your iPhone.

Apple have made a name for themselves in the last few years as a technology company that prioritizes security and digital privacy, their crown jewels being the iPhone and iOS. Anything that damages this perception is deeply hurtful to the brand, which makes this decision all the more confusing. Not only does the system have questionable utility in impeding child predators, but it could also be actively harmful if a malicious third party were to send otherwise safe images crafted to trip the NeuralHash filters.

Ultimately, Apple backed away from rolling out the feature after it became apparent that the move was deeply unpopular.

September

Australia trials facial recognition software to enforce COVID rules in two states

September brings us to Australia. Australia's response to coronavirus has often gone beyond measures commonly seen in the US and Europe, with some of the most stringent lockdown and border control policies in the world. At the very height of Australian lockdown, facial recognition software was mooted as a potential "next-step" in the country's surveillance regime.

The trial scheme involved downloading an app that would randomly post check-in requests, requiring the user to take a geo-tagged "selfie" at their quarantine address. If the location or image data don't match up, the police are called to the address. Understandably, this scheme immediately attracted dismay from wide sections of the Australian population, with many voicing concerns about violations of liberty and the normalisation of facial recognition technology.

October

Europe passes non-binding resolution to block facial recognition for law enforcement

In October, the European Parliament passed a non-binding resolution banning law enforcement from using automated analysis of biometric and behavioral signals. The resolution calls for a ban on facial recognition software, as well as any AI-driven biometrics system that enables mass surveillance of public spaces.

While member states are not legally obliged to enforce the resolution, the attached report comprehensively breaks down the applications of AI-driven software in law enforcement and recommends heavily against implementing black-box AI systems, in which biases and prejudices in the software cannot be easily identified and evaluated, instead calling for greater transparency in AI development and erring on the side of caution.

November

Facebook shuts down facial recognition system

In November, Facebook announced they were bringing an end to their facial recognition scheme, deleting over a billion facial profiles that the social media giant had collected since launching facial recognition for photo-tagging in 2010. Sunsetting this program shows a keen awareness at Facebook of the perception of facial recognition as a means of surveillance.

While the Metaverse project may continue to build upon facial recognition technology used by Meta, Facebook's parent company, it is unlikely that it will take any form that is immediately obvious to the consumer.

December

The US Government wins its appeal to extradite Julian Assange

Finally, in December we come to the case of Julian Assange. Assange has long been wanted by the US authorities to stand trial over the publication of military documents and diplomatic cables through WikiLeaks, the international news leak organisation he founded. After spending seven years in the Ecuadorian embassy in London seeking asylum from extradition to Sweden over rape allegations, Assange was evicted from the embassy in 2019 when Ecuador withdrew its protection, and was subsequently arrested by UK police for skipping bail; the Swedish case was later dropped.

Only a few weeks ago, the US won its appeal at London's High Court to extradite Assange. Human rights groups such as Amnesty International describe the charges as "politically motivated" and warn that the US's assurances of fair treatment for Assange are inherently unreliable and should not be taken at face value. Assange's fiancée intends to lead an appeal against the decision as soon as possible.

Fun bonus fact:

Did you know Assange was considered so valuable to the US that contingency plans for his retrieval included a capture-or-kill shootout on the streets of London if Russian intelligence made a run at him first? Read more about it here!

What should be our takeaways from this year?

It's been a rocky year for privacy. As the news we have covered this year illustrates, it's incredibly difficult to maintain any notion of privacy with attacks coming from every angle. If it isn't the police demanding a geo-tagged selfie or they'll haul you off to jail, it's a private corporation selling zero-day exploits to invisibly hack your phone. Maybe it's your phone manufacturer spying on the contents of your phone themselves. Maybe it's your local retailer scanning your eyeballs as you walk in to return a pair of jeans, then bombarding you with "Happy Birthday!" discounts. A few years ago, that last sentence would've been a playful jab at Minority Report's vision of a dystopian future instead of a concern voiced by the UK's data protection regulator.

The future is here, and computers are everywhere. If there's one point you should take away from 2021, it's this: your data is everywhere, too. Every time you interact with a computer system, knowingly or not, you are leaving some sort of evidence of your passing. In the modern world, personal data is a trail you leave behind everywhere you go.

However, in some ways, the evolution of digital privacy this year is a promising one. Social media giants like Facebook are coming to terms with the fact that the honeymoon period is over. People know that they have data, that they are data, and this data is valuable. Users are demanding greater authority over their data. What's more, they're starting to spot a bad deal when they see one. Although data monetization apps are seeking to redress the balance by paying users to generate data, most of these attempts are a new spin on the old "Take a survey, get paid!" scheme.

Rather, social media users expect certain conventions not to be violated, such as responsible data handling when dealing with third parties. The Cambridge Analytica scandal showed that Facebook was violating this core tenet, causing a confidence crisis the social media outlet has never quite recovered from. Facebook is shrinking as a platform among younger users, who prefer Instagram, TikTok, and Twitter. While all of these platforms operate on a similar farm-data, sell-ads revenue model, Facebook's continual abuses of data and its engagement-at-all-costs algorithms mean more people are starting to see Facebook as a raw deal.

The slow pace of regulation in the biometrics arena is somewhat frustrating, but not unexpected.

Legislation is inevitably slower than technological innovation. We cannot be surprised that governments are only just starting to catch up to the implications of technology that has been available for over half a decade. Facial recognition technology is here to stay, and advances in the speed and accuracy of these systems occur daily. That said, we are still a long way off from a point at which facial recognition is bias-free.

The technical details of AI-driven biometrics belie an even deeper question: Do we even want it? Even with the most accurate techniques, the most infallible accuracy, does the average person want to live in a society in which they are constantly under watch? In which the most personal aspects of their lives are freely given away the moment they step in front of a camera? Attitudes towards indiscriminate surveillance vary. While China has seemed to embrace all aspects of a highly connected society as the logical next step in city-planning and development, Europe has signaled that putting AI at the heart of data processing for law enforcement is a violation of human dignity and fair process.

Where next for 2022?

Going into 2022, it wouldn't be surprising to see more focus on letting users pick and choose the data they give away. Apple has already established a policy framework in iOS for doing this and will most likely iterate on the idea several more times until they get it right. Europe's General Data Protection Regulation was adopted with this goal in mind, but its execution of cookie consent has been less than stellar.

The most visible effect GDPR has had on most web users is to clutter their browsers with cookie acceptance prompts that most won't take the time or effort to customize. At worst, users are now presented with a pop-up politely informing them that by browsing the site in question they have already consented to cookies! This is no better than the way the internet used to work. In fact, it's slightly worse.

That doesn't even begin to take into account the difficulties entities outside the EU have with GDPR compliance. In response, California has already produced privacy legislation similar to the GDPR, and other states are expected to follow suit this year. The EU is aware of the problems with cookie consent as it is currently implemented, and it will be interesting to see a future solution that integrates cookie consent into a broader data protection profile established by the user.

Final thoughts

To conclude, 2021 has been a mixed bag. It's very easy to hyper-focus on the stories that highlight the worst privacy abuses, but there's enough evidence out there to suggest the wild west era for major data controllers is well and truly over... Let's hope.

Written by: Sam Dawson

Sam is a cybersecurity researcher currently pursuing a PhD at the University of Kent. He has spent the last few years contemplating some of the ways in which computers have been built wrong, and how we can possibly detect those flaws. When not stressing out over arcane details of computer hardware, Sam enjoys wandering the south-east coast looking for new challengers in Street Fighter.

