
Examining Community Disquiet in the Age of COVID-19

26 Nov 2020

 

Alicia Wee and Mark Findlay chart the different sources of disquiet generated by the extensive use of AI-assisted surveillance technology during the age of COVID-19, highlight existing and potential challenges, from an individual’s perspective, to control efficacy, and propose greater citizen inclusion to rebuild trust in the technology and the sponsoring governing authorities.

 

In a recent paper, AI and Data Use: Surveillance Technology and Community Disquiet in the Age of COVID-19, we examined a wide range of social responses to the different control measures as the use of surveillance technology escalated worldwide.

The concerns voiced by citizens underscore their worries about infringements of their rights, liberties and data integrity. In our paper, we categorised these concerns under the following themes:

  1. Disquiet about the data collected;
  2. Disquiet concerning authority styles confirming control responses;
  3. Disquiet regarding the integral architecture of control strategies employed;
  4. Disquiet surrounding infringement of rights and liberties;
  5. Disquiet surrounding the role of private sector; as well as
  6. Uncertainties regarding a post-pandemic world and its “new normal”.

Ultimately, the research revealed that the resulting distrust of both the technology used and the authorities behind it has a pronounced effect on the tech’s utility, accuracy and efficacy: for tech that operated on a by-consent model, there have been multiple reports of low take-up rates globally. Mandatory or compulsory measures, on the other hand, have brought about citizen resistance, including protests and illegal hardware tampering.

 

Concerns identified

 

#1 Disquiet about the Data Collected

 

Pandemic control data collection extends beyond contact tracing apps into more invasive forms of tracing measures, including surveillance monitoring technology such as CCTVs, electronic tagging wristbands, temperature sensors and drones. A common and prevailing anxiety voiced by citizens across the states and communities surveyed centres on key questions of data integrity and personal protection: what forms of data are being stored, whether the vast amounts of data collected are stored appropriately, who can use and own the data collected, how long the data will be retained, and what sharing of sensitive health data is occurring. Some of these concerns, both perceived and real, reflect that data subjects are not fully informed about the protective purposes of surveillance, and the consequent risks these technologies may pose.

In Australia, the hybrid centralised/decentralised approach towards data collection has drawn criticism from data subjects who are unconvinced that personal health data is adequately protected. Public discourse recalls violations of centralised databases as recently as 2016, when the Australian government lost 2.9 million Australians’ sensitive medical records due to “pure technical naivete”. This year alone, there have been reported instances of hackers gaining access to and leaking sensitive COVID-19 records, detailing more than 400 pages of communications and messages between health officials and doctors. Most recently, it has come to light that Australia’s COVIDSafe data was ‘incidentally’ collected by intelligence agencies during the first six months of the app’s operation. While none of the incidental data has been decrypted, accessed or used, the perception of data exploitation lingers.

In a recent study conducted by the Institute of Policy Studies (IPS), a significant portion of respondents in Singapore agreed to have their phone data tracked for contact tracing purposes, which suggests a willingness to sacrifice their privacy (to a degree) in order to resume their daily lives as soon as possible. Recognising the responsibility which should attach to such significant levels of public support, IPS warned that any ongoing forms of large-scale government-sanctioned surveillance programmes will inevitably raise questions about data protection and individual liberties that must be addressed by government and other data sharers (i.e., how sensitive personal data will be used, who has access to it, and whether private companies will be allowed to utilise and exploit it in the future for commercial, non-pandemic-related purposes).

 

#2 Disquiet concerning authority styles confirming control responses

 

The wake of COVID-19 has also seen a surge in the adoption of authoritarian-style control strategies.

Most recently, the Singapore government announced a pilot programme combining SafeEntry and TraceTogether data to improve the contact tracing process. SafeEntry, an island-wide mandated digital check-in system, logs records of the locations data subjects visit. The government has, by contrast, repeatedly emphasised that TraceTogether is privacy-centric, processing anonymised proximity data rather than geolocation indicators to assist in contact tracing efforts. Originally, given the voluntary nature of TraceTogether, data subjects did not need to use both SafeEntry and TraceTogether, although they were encouraged to do so. From October 2020, however, data subjects participating in larger events such as meetings, incentives, conferences and exhibitions (MICE) have been required to use the TraceTogether app alone in order to log a SafeEntry check-in. A detailed commentary on measures for the reopening of activities can be accessed here.
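The location/proximity distinction between the two systems can be made concrete with a minimal sketch of the kind of record each might produce. The field names and values below are hypothetical illustrations, not drawn from the actual SafeEntry or TraceTogether implementations:

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative only: field names are hypothetical, not taken from
# the real SafeEntry or TraceTogether systems.

@dataclass
class LocationCheckIn:
    """A SafeEntry-style record: ties an identified person to a place."""
    national_id: str        # identifies the data subject directly
    venue_name: str         # reveals *where* the subject was
    timestamp: datetime

@dataclass
class ProximityEncounter:
    """A TraceTogether-style record: logs who was near whom, not where."""
    my_rotating_token: str      # short-lived pseudonym, not an identity
    peer_rotating_token: str    # the other device's pseudonym
    signal_strength_dbm: int    # Bluetooth signal strength, a rough distance proxy
    timestamp: datetime

check_in = LocationCheckIn("S1234567A", "Example Mall", datetime(2020, 10, 1, 9, 30))
encounter = ProximityEncounter("tok_8f2a", "tok_c41d", -67, datetime(2020, 10, 1, 9, 31))

# The proximity log carries no location field at all; identities behind the
# rotating tokens can only be resolved by the health authority.
print(check_in.venue_name)            # location data is explicit here
print(hasattr(encounter, "venue_name"))  # prints False
```

The point of the sketch is that combining the two record types links a pseudonymous proximity trail to identified location check-ins, which is why conflating the systems heightens identification capacity.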

In our paper, we suggested that this conflation of technology and purpose could be seen as compromising the voluntary status of the TraceTogether app, while also suggesting that authorities will be using both location and proximity data to monitor data subjects, heightening the identification capacities of such control strategies. This change in the conditions of citizen compliance, judging from overseas experience and domestic sentiment, may raise suspicions amongst users and qualify citizen self-determination with regard to their app use and data sharing. The lower-than-required uptake of TraceTogether explains this development, but the strain on trust caused by compulsory application is likely (from the experience of the study) also to diminish citizen cooperation. These concerns were realised at a recent press briefing on 20 October 2020, where the multi-ministry task force tackling COVID-19 declared that TraceTogether will be made mandatory by December 2020. In addition, making a 70% take-up rate of TraceTogether a condition for reopening the country confirms the policy transition to more stringent surveillance, diminishing citizen self-determination.

Faced with risks of privacy infringements and data breaches, it should be remembered that data subjects do not always possess suitable remedies should such breaches occur in pursuit of surveillance-centred control. For instance, Australians do not have universal personal legal recourse to address privacy infringements by public databases, since fundamental privacy rights are not accorded by Australia’s constitution, treaty obligations, or even common law. Legal mechanisms to safeguard the use and retention of data collected, not grounded in unambiguous rights of privacy, have not proven sufficient to combat or deter the data misuse that becomes more likely with the prolonged retention of app data beyond its promised purposes.

Instead, data subjects have expressed their discontent towards control measures both online and in person. In Singapore, data subjects have resorted to tampering with TraceTogether tokens by removing the battery, or swapping QR codes with other devices. In other parts of the world, sections of the Australian public sought to counter the government’s control responses through nationwide protests against lockdown measures. Hundreds of anti-lockdown protestors gathered during “Freedom Day” rallies, chanting “freedom” and “human rights matter”, opposing restrictions on personal movement and association. In Indonesia, an open letter collated by 13 human rights organisations was sent to the Indonesian Minister of Communication and Information Technology requesting strong user privacy protections for the PeduliLindungi app, raising questions over the safety of storing personal data on smartphones.

 

#3 Disquiet regarding the integral architecture of control strategies employed

 

Doubts have arisen among data subjects from tech sponsors’ repeated overselling and overpromising of the privacy-protection capacities of technologies, particularly those operating via Bluetooth. The research established that such overselling, paired with a wider public misunderstanding of the capabilities and limits of current technologies, generates distrust both in the device and in the authority behind it.

Despite the wide-scale state promotion of contact tracing technology, multiple reports have revealed citizens’ reluctance to download the apps, with many expressing apprehensions regarding the technology powering surveillance devices. In China, the Health Code (which users can sign up for via AliPay and WeChat) functions on a green-yellow-red scheme, indicating to users that they are free to travel; should be in home isolation; or are confirmed to be COVID-19 patients, respectively. Several users have reported erroneous “red” designations that remained uncorrected even after officials were alerted to the problem, leaving many to question the accuracy of such surveillance and the genuine utility of the related apps.

Much community unease surrounds the surveillance technology itself, exacerbated by the reality that the common user or data subject does not possess the technical understanding of how the apps work, or an appreciation of their reach in facilities such as facial recognition software. To their credit, the Singaporean authorities have sought to bridge the gap in technical knowledge on the operability of COVID technologies. The Government Technology Agency (GovTech) has released a comprehensive white paper outlining the data which TraceTogether collects, and the trust-by-design premise that the app is built upon to safeguard privacy. GovTech has also released a shorter piece on its website, “9 geeky myth-busting facts you need to know about TraceTogether”, to address commonly misunderstood aspects of the app in a more accessible manner. These ‘facts’ include express clarifications that the app is not used to track or spy on citizens’ whereabouts, and that consent to the in-app functions of the phone does not equate to giving the government unlimited access to all of the user’s personal and phone data. Unfortunately, much like the white paper, this statement is not easily located within the app’s interface (even within its help section), or on the related website. These efforts, albeit commendable, came after the technology had been released. Such after-the-fact explanations and justifications of the technology have therefore been described as “mere performances of public participation”, reinforcing the top-down practices of technology sponsors when it comes to citizen inclusion.

 

#4 Disquiet surrounding infringement of rights and liberties

 

Studies have shown that surveillance has a strong tendency to target racialised people, migrants, and the vulnerable sectors of the labour market, all of whom “bear the burden of heightened policing powers and punitive ‘public health’ enforcement” as they are more likely to have to leave their houses to go to susceptible work environments no matter what the risks. The lived realities of such front-line communities differ from the privileged individuals who are afforded greater privacy in their ability to work from home and socially distance.

In attempts to enforce lockdowns, there are reports of ethnic minorities and marginalised groups being disproportionately targeted with violence and unwarranted identity checks, especially in poorer areas of cities. People of colour, indigenous persons and minorities, disproportionately represented in detention and prison populations where overcrowding catalyses the spread of the virus, are at greater health risk. In urban ghettos, populated along ethnic and racial lines, rates of infection are unequal and intrusive control operations are high. This dynamic was witnessed in the migrant housing estates in Melbourne leading up to its recent extensive lockdown. The overreach of heightened surveillance powers enables public and private data harvesters to further invade privacy, deter free speech, and discriminate against vulnerable groups in the community. It has been observed that there have been “signs that authoritarian regimes are using COVID-19 as a pretext to suppress independent speech, increase surveillance, and otherwise restrict fundamental rights, going beyond what is justified by public health needs”.

 

#5 Disquiet surrounding the role of private sector

 

Apart from personal disquiet expressed by data subjects who have directly interacted with the surveillance technologies, experts have expressed apprehensions surrounding the concentrated control of computing infrastructure and its implications on the existing power asymmetries between private tech companies and public agencies.

For example, French officials reported that when they approached Apple and Google with their centralised protocol for contact tracing to see if an accommodation could be reached, they were met with resistance, as only decentralised technologies were allowed. The ability of tech giants like Apple and Google to dictate the kinds of apps they will support, regardless of the state’s authority, exemplifies the power disparities between tech companies and public agencies even in crisis contexts.
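The architectural dispute can be sketched by showing where exposure matching happens in each model. This is a deliberately simplified illustration, not the actual French (ROBERT) or Google/Apple Exposure Notification protocols, which involve cryptographic key rotation and far more machinery:

```python
# Simplified sketch: centralised vs decentralised contact tracing.
# Function names and data shapes are illustrative, not from any real protocol.

def centralised_match(server_contact_graph, infected_id):
    """Centralised model: the server holds everyone's contact graph and
    computes who was exposed, giving the authority a global view."""
    return server_contact_graph.get(infected_id, set())

def decentralised_match(my_observed_tokens, published_infected_tokens):
    """Decentralised model: the server only publishes tokens of infected
    users; each phone checks locally whether it ever observed one."""
    return bool(my_observed_tokens & published_infected_tokens)

# Centralised: the authority learns that B and C met infected user A.
graph = {"A": {"B", "C"}, "B": {"A"}}
exposed = centralised_match(graph, "A")   # -> {"B", "C"}

# Decentralised: only my own phone learns whether I was exposed; the
# server never sees my contact history.
was_exposed = decentralised_match({"tok1", "tok9"}, {"tok9", "tok5"})  # -> True
```

The sketch shows why the designs were incompatible: in the centralised model the matching logic (and the contact graph) must live on a government server, which is precisely the data flow the decentralised, on-device design refuses to permit.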

This exercise of private commercial power demonstrates private companies' ability to negotiate into the realm of political responsibility on an international scale. The growing encroachment by technological conglomerates into political and medical spheres is a phenomenon that requires greater attention, especially since the tech giants’ commercial interests may not necessarily overlap with the policy imperatives of political and medical experts.

 

#6 Uncertainties regarding a post-pandemic world and its “new normal”

 

Beyond the immediate threat of the pandemic, another thread of disquiet centres on the long-term political and legislative impacts of enhanced surveillance. The question that arises is whether these surveillance technologies, expanded in the pandemic context, will be further normalised as the public becomes less sensitive to privacy infringements and, consequently, less resistant to even greater intrusion in the name of public safety (argued as necessary for an eventual return to a less rights-restricting life).

The longer such AI-assisted surveillance technologies are accessible and proliferate in society, the easier it is to ignore their medium-term reach, to become resigned to the compromise of rights and liberties, and to forget the disquiet that emerged in the initial stages of the control responses. Bearing this in mind, there is a responsibility on surveillance technology promoters to build in regulatory protections (ethical compliance in particular) at all stages of implementation and operation. The pressure for such regulatory pre-emption becomes greater in the context of smart city urban planning, which is so heavily reliant on surveillance tech and mass data sharing.

In the face of intrusive short-term measures, a commonly exercised legislative tool is the introduction of sunset clauses necessitating a return to some power status quo as the pandemic winds down. When the public believes that such invasive measures will eventually discontinue, more may be willing to endure a temporary curtailment of rights and rationalise the surveillance regimes as a necessary and perhaps proportionate response to resolve the immediate health crisis. However, Montsenok et al. argue that sunset clauses carry an unintended consequence, a termination paradox: temporary measures, so moderated with expiry dates, may invariably lead to a proliferation of control policies that would not otherwise have been approved.

 

Effect of Disquiet and Proposal for Greater Citizen Inclusion

 

If negative subject perceptions are not properly, promptly and personally addressed, governments will struggle against an anti-participation culture, leading to dissatisfaction with the performance of the tech, increased unhappiness with surveillance, and even protests and petitions against government responses that require a compromise of liberties and personal data protection. This will potentially become a vicious cycle – apps are distrusted, their efficacy is impeded through lower uptake, virus control outcomes are negative, and the citizen loses faith in the state’s capacity to control the pandemic.

However, it would be incorrect to suggest that distrust is universal, or that it has completely eroded public confidence in control technologies. Many citizens are willing to forgo liberties and rights if their personal safety is enhanced in crisis contexts. Yet the research in our paper reveals that privacy and personal data protection need not be discounted for public health maximisation. There is evidence that countries that have based their surveillance technology on principled design, or have improved trust through citizen inclusion, achieve better overall outcomes for all.

So, what of the future? The research suggests that instead of building a permanent surveillance regime as a remedy for ongoing pandemic threats, there is still time to “rebuild people’s trust in science, in public authorities and in the media”. By ensuring greater transparency of data, control information and policy details through techniques such as information loops, citizens will be able to monitor the management of their publicly and privately sourced data and judge for themselves whether the data managers and repositories are adhering to ethical principles and respecting citizens’ interests. With greater citizen inclusion, users can make informed personal choices about what technology they will tolerate and why, and may, as a consequence, be more willing to participate in contact tracing activities.

 


Should you be interested in finding out more, click here to access the full paper, or feel free to reach out with any comments and suggestions.

 

Alicia Wee is a research associate at the Centre for AI and Data Governance.

Mark Findlay is the director of the Centre for AI and Data Governance.

This research is supported by the National Research Foundation, Singapore under its Emerging Areas Research Projects (EARP) Funding Initiative. Any opinions, findings and conclusions or recommendations expressed in this material are those of the authors and do not reflect the views of National Research Foundation, Singapore.

Many thanks to Josephine Seah for her comments and insights.

Last updated on 26 Apr 2021.