If you haven’t already, you might soon find yourself reading more headlines about “privacy concerns” in the coming weeks as the COVID-19 crisis continues. That’s because many governments are implementing emergency measures — at the cost of our privacy — for public health interventions through a tried-and-true method called contact tracing.
Contact tracing, often described as old-school detective work, is the crucial process of tracing chains of COVID-19 transmission and preventing future spread as much as possible. It works by identifying people who have tested positive and using their location history to find others who were in close proximity and may have been exposed to the virus. While incredibly successful and important for curbing the number of new cases, digital contact tracing requires authorities to peer into our private lives through our cell phones and GPS/location data. See the problem? This method inherently revolves around a tradeoff between public health and privacy. Saving lives obviously tips the scale.
This dilemma is not necessarily new, however. Your fundamental right to privacy is internationally recognized and universally declared. And yet, as technology continues to advance, this right is constantly being traded away in our daily routines: the privacy we are entitled to versus the convenience of the technology we rely on. How many times have we handed over our location data or health stats to Google Maps or our Apple Watch?
The use of technology as part of the COVID-19 response is central to the conversation about the preservation of civil liberties during a global health emergency. This article will examine the ways that different governments and corporations are using or plan to use private data to track disease, and the various implications this may have on privacy and security.
China
“I thought the days when humans are ruled by machines and algorithms won’t happen for at least another 50 years. [But] this coronavirus epidemic has suddenly brought it on early,” a blogger writes on the popular Chinese forum, Zhihu. Although somewhat dystopian, this account of aggressive scrutiny and surveillance occurring in China does indeed depict reality for Chinese residents. China is one of the many East Asian countries that has used contact tracing successfully, but not without its drawbacks.
Following the outbreak of COVID-19, the Chinese government, together with the Alipay payment platform, launched an app called Health Code on February 11, through which users report their personal information and symptoms (if any). Based on each user's location, health information and travel history, Health Code assesses their risk of COVID-19. It also collects national identity numbers, passport numbers, home addresses, potential past exposure to COVID-19 and more. Once this data is verified by authorities, residents are given colour-coded QR codes that dictate whether they may work, travel and/or enter public spaces. It's business as usual for those who receive a green code, but yellow and red codes indicate the need for self-quarantine.
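The government has never published Health Code's actual scoring logic, but a risk-tiering system like this can be pictured as a simple rule-based classifier. The sketch below is purely illustrative: the signal names and rules are assumptions for exposition, not the app's real behaviour.

```python
# Purely illustrative sketch of a colour-code risk tier. This is NOT
# Health Code's actual (unpublished) logic; the signals and thresholds
# are assumptions made for illustration only.

def assign_colour(symptomatic: bool, close_contact: bool,
                  visited_hotspot: bool) -> str:
    """Map self-reported and tracked risk signals to a QR colour."""
    if symptomatic or close_contact:
        return "red"      # self-quarantine required
    if visited_hotspot:
        return "yellow"   # recent travel through an outbreak area
    return "green"        # business as usual
```

Under rules like these, a traveller with no flagged signals would receive a green code, while a reported fever or a logged contact with a confirmed case would immediately flag red.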
An example of a green QR code given to a user on the Health Code app. (Image Source: The New York Times)
Residents must display their QR code before entering a public place or using public transit.

People scan a QR code on their phones while volunteers check their temperatures before entering a market in Kunming, in China’s southern Yunnan Province. (Image Source: The New York Times)
This app has been instrumental for contact tracing. After its launch in February, more than 300 cities were using Health Code with approximately 700 million users nationwide. Given the widespread use of this system, many privacy experts and regular citizens alike have been questioning the privacy controls surrounding this data collection. While many cities have removed the mandatory QR code pass following the decline of new cases, and despite reassurances that this data is single-purpose, there have been some reports that Health Code does more than its publicly-stated purpose.
Health Code’s coding instructions, which tell the app what to do with the data collected, may actually go beyond the public interest. In fact, Health Code (the app itself) transmits the data mentioned above directly to law enforcement authorities, which is a questionable extension of such “automated social control” and also sets a dangerous precedent. What purpose would the police have with this information besides enforcement of quarantine measures? How long after the pandemic subsides is such data retained? How is this data analyzed and which third parties are privy to it? And most importantly, are these measures being left unchallenged and unchecked?
Data is rarely single-purpose, which means there is a high chance that even after the pandemic has subsided, it can be put to different, more insidious ends; in Hong Kong, for example, facial recognition software has often been used to single out protestors. The adoption of similarly ubiquitous surveillance methods has two sobering implications: (1) the loss of privacy among individuals and (2) the loss of accountability for central authorities. Without any independent oversight, governments can weaponize this technology to implement draconian laws against political dissidents and minorities. For Muslim minority groups living in Xinjiang, this is a grievous reality.
Maya Wang, a senior China researcher at Human Rights Watch, participated in an interview on May 1, 2019, entitled “China’s ‘Big Brother’ App.” In it, she details the unprecedented level of illegal surveillance and mass arbitrary detention faced by Uyghur and Turkic Muslims in Xinjiang — all made possible through AI surveillance companies SenseTime, Megvii and Yitu, and apps like WeChat, which track users’ locations and transmit this data to law enforcement.
SenseTime is one of several AI companies used by China for facial recognition surveillance. (Image Source: The New York Times)
While the specific implications of Health Code on Chinese civil liberties remain largely unclear at this point, the existence of widespread surveillance systems will only grow after COVID-19. If we are to preserve some semblance of our right to privacy, government oversight must be institutionalized and apps like Health Code must be closely monitored by human rights groups.
South Korea
South Korea has been credited with having some of the most success in dealing with COVID-19 due to immediate action taken by the government. The availability of widespread testing and adoption of an intense infectious disease law have been the driving forces behind this success. This law allows authorized officials to collect dossiers on positively-tested patients that include cell phone data, credit card and banking transactions and closed-circuit television (CCTV)/security footage.
This data is not restricted to use by health organizations and/or authorities alone. In fact, the South Korean government publishes such data via local and national government websites. It also frequently sends out text alerts, which include detailed reports on patients’ ages, personal details, where they have been and more. Due to South Korea’s “smart city infrastructure,” patients can easily be found and tested.
Examples of the text alerts South Koreans get, alerting them to new cases. These alerts typically detail the age of the patient, when, where and how they contracted the virus, and their location. (Image Source: BBC)
This radical openness with information has led to greater public trust in South Korean institutions, interestingly evidenced by the lack of panic buying in grocery stores; people who have confidence in their government feel safer and are less likely to panic. The meticulous spread of information via text alerts and government websites serves as a diligent reminder that the government is ‘on it’ and that all bases are covered. Such a high level of public trust is surprising because the public distribution of private information doesn’t typically fare well in a liberal democracy — simply put, most people aren’t comfortable with (a) giving up their private data and (b) having that data broadcast.
A greater level of public trust, however, doesn’t mean that privacy concerns have gone unvoiced in South Korea. While incredibly helpful in curbing and containing the number of cases, the dissemination of intrusive, minute-by-minute logs of an individual’s whereabouts can have unsought consequences. One Korean woman claims that the personal details that were publicized easily revealed her identity; she feared for herself and her family while feeling the stigma and shame attached to COVID-19. Several others have faced malicious comments online after their identities were pieced together by the public. Some of the most drastic and humiliating revelations, however, have included the public exposure of extramarital affairs through a string of emergency alerts sent by the government.
Another example of how real-time text alerts appear on phones. (Image Source: Smart Cities World)
Due to the pressing nature of COVID-19, technologists are stepping up with solutions in more aggressive (but necessary) ways at a much faster pace. Though South Korea’s response may be emulated by North American governments, the side-effects of public distribution (i.e. feelings of shame and the fear of being judged) may dampen the established public trust. People may begin to feel alienated and less inclined to cooperate, a factor that has largely been responsible for the effectiveness of technology-based contact tracing.
As Mr. Goh of the Korea Centers for Disease Control and Prevention told the BBC, this level of public sharing is unprecedented, and indeed, “after the spread of the virus ends,” he says, “there has to be society’s assessment whether or not this was effective and appropriate.” Technology has proven to play a critical role in saving lives, but its effectiveness comes at the expense of individual privacy — a necessary bargain, but one that must also be carefully scrutinized.
Apple-Google Partnership
Apple and Google partner on COVID-19 contact tracing tech. (Image Source: Apple)
Perhaps the most interesting collaboration to emerge from COVID-19 is one between two tech rivals, Apple and Google. On April 10, 2020, the companies announced a joint effort to help combat COVID-19 and save as many lives as possible. Their proposal uses Bluetooth technology to conduct contact tracing. Unlike GPS tracking, which would store raw location data on a central server, the Bluetooth approach only transmits random codes called ‘keys’ between individual phones. A server is still used to relay information, but Bluetooth technology essentially eliminates the middleman: phones that come within close proximity exchange random, anonymous keys directly. Should a person become infected and choose to disclose that to the monitoring app, others who interacted with that person will be notified of the potential exposure. As Recode describes it, this technology “works a bit like exchanging contact information with everyone you meet, except everything is designed to be anonymous and automatic.”
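The key-exchange idea can be sketched in a few lines. The toy model below is a deliberate simplification of the published Apple-Google design, which derives rotating identifiers from daily keys and adds timing and signal-strength checks; the class and method names here are invented for illustration.

```python
import secrets

class Phone:
    """Toy model of decentralized Bluetooth contact tracing
    (a simplification, not the actual Apple-Google protocol)."""

    def __init__(self):
        self.broadcast_keys = []   # random keys this phone has sent out
        self.heard_keys = set()    # keys received from nearby phones

    def broadcast(self) -> bytes:
        # A fresh random key carries no identity or location.
        key = secrets.token_bytes(16)
        self.broadcast_keys.append(key)
        return key

    def hear(self, key: bytes) -> None:
        self.heard_keys.add(key)

    def check_exposure(self, infected_keys) -> bool:
        # Matching happens on-device; nothing personal leaves the phone.
        return any(k in self.heard_keys for k in infected_keys)

alice, bob = Phone(), Phone()
bob.hear(alice.broadcast())        # Alice and Bob pass each other
# Alice tests positive and voluntarily uploads her keys; Bob's phone
# downloads the published list and finds a match locally.
assert bob.check_exposure(alice.broadcast_keys)
```

The design choice worth noticing is that the server only ever sees the random keys of users who volunteer them after a positive test; who met whom, and where, is computed on each phone.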
Additionally, both companies have claimed the software will be designed with particular care to protect user privacy and security. Apple, in particular, has a demonstrated commitment to user privacy and has butted heads with law enforcement to prove it. After the 2015 San Bernardino terrorist attack, Apple CEO Tim Cook refused to unlock the suspect’s iPhone for FBI investigation, leading to the FBI-Apple encryption dispute. Apple held its ground, with Cook citing the “chilling” implications of opening a backdoor for the U.S. government. This example is indicative of just how seriously Apple takes user privacy, even when it means clashing with a terrorist investigation.
A note supporting Apple hangs on the window of a San Francisco Apple Store. (Image Source: CNET)
There are also certain features that distinguish this software from the tech used in China or South Korea. Apple and Google have promised to champion privacy, transparency and consent: explicit user opt-in is required, making the technology completely voluntary. Furthermore, and perhaps most importantly, the software will not collect personal information such as an individual’s contacts or location data. People who test positive will not be identifiable, and this information will only be available to public health authorities for the sole purpose of COVID-19 containment.
While such innovation is commendable and necessary, there are still myriad ways this data could be misused by hackers or third parties attempting to de-anonymize the random keys. This has been done before with iOS exploits, which are software tools that maliciously take advantage of flaws in a computer system; for instance, the 2019 Insomnia exploit was used by a state-sponsored hacking unit in China to spy on Uyghur Muslims through their iPhones.
As mentioned earlier, trust is an integral part of our societies and plays a critical role in our institutions, and it is not easily earned or given. I am still somewhat distrustful of promises of transparency, privacy or consent from profit-driven companies. Google, in particular, has a proven track record of leveraging pools of user data for commercial and marketing purposes. What guarantee is there that private third parties will remain committed to transparency, privacy and consent? The potential for public good may be undermined, especially if data is retained post-crisis and its future deletion remains uncertain.
Another important consideration is the efficacy of tools that are entirely participatory. The Chinese model worked because adoption of digital contact tracing was government-mandated; you had to be able to present a green QR code at various checkpoints. The South Korean model worked because South Koreans tend to be very socially conscious and have demonstrated high levels of compliance and cooperation. How effective is the software if it is only downloaded by, say, 5-20% of the population?
A woman looks at the TraceTogether app used in Singapore. TraceTogether uses Bluetooth technology as well in an effort to maintain user privacy. (Image Source: The Newpaper)
Singapore, for example, launched a voluntary, opt-in app called TraceTogether. Only about one-sixth of the population downloaded it. Clearly, there is an issue with implementation and widespread adoption when the tech is opt-in. If most of the population does not download the app, the software cannot be truly effective in curbing new cases and containing COVID-19. If consent is central to Apple and Google’s Bluetooth software, will it really be effective?
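A quick back-of-the-envelope calculation shows why low adoption is so damaging: a contact event is only recorded if both people involved run the app, so (assuming independent uptake between any two people who meet) the fraction of contacts covered scales roughly with the square of the adoption rate.

```python
# Fraction of contact events detected when BOTH parties must run the app.
# Simplifying assumption: app uptake is independent across any two people.
def coverage(adoption: float) -> float:
    return adoption ** 2

for rate in (1 / 6, 0.20, 0.60):
    print(f"adoption {rate:.0%} -> ~{coverage(rate):.1%} of contacts covered")
```

At Singapore’s roughly one-in-six uptake, fewer than 3% of contact events would be captured under this model, which helps explain why opt-in schemes struggle to curb transmission on their own.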
No Time to be Complacent
The new normal. (Image Source: NBC News)
These are important questions that must be posed during this time. As demonstrated by China and South Korea, contact tracing is important to the containment of COVID-19 and it has been incredibly helpful — there is no getting around that. Unfortunately, we don’t have a silver bullet that provides us with the perfect answer as to how society should navigate this unprecedented situation. We don’t have the luxury of time to conduct lengthy studies or to make long-term assessments regarding the efficacy of contact tracing technology. Technology, for the most part, remains an immature regulatory space.
There is also no time to be complacent, however. One thing is certain: the unregulated and unrestrained use of technology to track COVID-19 is a slippery slope. The careful management of all similar endeavours across the globe, where our right to privacy is limited, requires watchful and vigilant eyes from civil organizations and the general public alike. Accountability and transparency are paramount, especially post-crisis, when our governments and corporations may be tempted to continue using our private data.