Ayesha Khanna: Fighting Covid-19 with a Mobile App

Principles to Save Lives and Protect Privacy

The fear of losing privacy through digital intrusion is legitimate, but the dichotomy between saving lives and protecting privacy is false. If data and artificial intelligence are strictly governed with transparent ethical guidelines, citizens could temporarily give private data to governments without fearing long-term mass surveillance.

China and most of its neighbors appear to have the pandemic largely under control, while Europe and the US are suffering heavy casualties and economic contraction. What lessons can others take from the Asian experience of using digital technology to contain the virus?

First, let’s be clear: the fear of losing privacy in the digital age is real. However, emergency measures to save lives need not lead to systematic surveillance if proper governance is in place. Since the 9/11 terrorist attacks and subsequent sporadic attacks across Europe, governments have leveraged technologies such as security cameras and GPS tracking in the name of public safety. Americans remain divided on whether the US government has gone too far in its surveillance, and policymakers continue to grapple with balancing their responsibility to the collective against the demands of individual rights.

The Coronavirus presents a new opportunity

The Coronavirus presents a new opportunity to build strong criteria and frameworks for the use of personal data for public safety. Consider this scenario: a contact tracing app flags you as having been exposed to a Covid-19 patient. Because data is shared across government and business, your employment records are flagged, you face additional questioning at borders, and your health insurance premium rises. While you supported the government’s contact tracing efforts in the interest of your own and the public’s health, you now find yourself a victim of data abuse. This scenario can only be prevented through better governance.

Some Asian governments have effectively used information technology to contain the Covid-19 pandemic. Taiwan has issued government mobile phones to quarantined individuals to monitor compliance, South Korea has made extensive use of mobile phone location data to reduce contact tracing time from 24 hours to ten minutes, and China has embedded its Health Check app in the nearly universal WeChat and Alipay services to flag and notify those who need to quarantine. But what happens to the personal data once the pandemic has ebbed?

The Singapore government’s «TraceTogether» app presents an alternative way to collect data that is more transparent to citizens. The app, downloaded on a voluntary basis, uses Bluetooth signals to determine whether a person’s mobile device has been in close proximity to someone confirmed to be infected. To be effective nationally, the majority of the population has to download the app. The government has communicated openly across all media about both the data collection mechanism and the limits of the app: it does not collect location or any other data from the phone, and all logs remain on the phone itself unless a user has been identified as being at risk. If a user uninstalls the app, she can request that the government delete her identification data from its servers. Importantly, once the pandemic is over, the app will prompt users to deactivate it. Such safeguards point to ways in which emergency monitoring can be achieved without slipping into long-term mass surveillance.
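To make this design concrete, here is a minimal sketch of an on-device contact log of the kind described above. The class names, fields, and retention window are illustrative assumptions, not the actual TraceTogether implementation; the point is simply that records stay on the phone and leave it only with the user’s explicit consent.

```python
# Minimal sketch of an on-device contact log in the spirit of the design
# described above. All names and thresholds are illustrative assumptions,
# not the actual TraceTogether implementation.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List


@dataclass
class Encounter:
    temp_id: str        # rotating, anonymized identifier broadcast by the other phone
    seen_at: datetime   # when the Bluetooth signal was observed
    rssi: int           # signal strength, a rough proxy for distance


class ContactLog:
    """Keeps all encounter records on the device; nothing leaves the phone
    unless the user explicitly consents after being flagged as at risk."""

    RETENTION_DAYS = 21  # assumed retention window

    def __init__(self) -> None:
        self._encounters: List[Encounter] = []

    def record(self, temp_id: str, rssi: int) -> None:
        self._encounters.append(Encounter(temp_id, datetime.utcnow(), rssi))
        self._purge_old()

    def _purge_old(self) -> None:
        cutoff = datetime.utcnow() - timedelta(days=self.RETENTION_DAYS)
        self._encounters = [e for e in self._encounters if e.seen_at >= cutoff]

    def export_with_consent(self, user_consented: bool) -> List[Encounter]:
        # Upload to health authorities happens only with explicit consent.
        if not user_consented:
            return []
        return list(self._encounters)

    def delete_all(self) -> None:
        # Supports the "uninstall and request deletion" guarantee locally.
        self._encounters.clear()
```

The essential design choice is that export and deletion are operations the user controls, which is precisely the guarantee the Singapore government has publicized.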

The need for an ethical and legal framework

The potency of digital measures during the coronavirus pandemic has highlighted the need for an ethical and legal framework for data governance. The EU’s GDPR serves as a strong foundation for data privacy and dignity. Answering the following five questions is essential for principled policy-making on data governance in times of crisis.

First, when can private data be accessed? Governments should only be allowed to access personal data during officially declared states of emergency and with the approval of an independent and transparent process, such as a parliamentary vote. When it comes to street cameras and the facial recognition technologies that rely on them, many governments have assumed broad authority without setting limits on themselves.

Second, which types of data can be accessed? Only the minimum data necessary to identify at-risk locations or individuals, such as telco location data, and only under strict regulatory oversight for a specified period. Beyond this, search engine behavior and social network usage for modeling and forecasting must be aggregated and anonymized, as Google and Facebook are currently proposing to the US government.
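As an illustration of what «aggregated and anonymized» can mean in practice, the following hypothetical sketch counts visits per district and day and suppresses any group too small to protect against re-identification. The threshold and field names are assumptions for illustration, not the actual reporting pipelines of Google or Facebook.

```python
# Illustrative sketch of aggregating behavioral data before sharing, using a
# simple k-anonymity-style suppression rule; thresholds and field names are
# hypothetical, not any company's actual reporting pipeline.
from collections import Counter
from typing import Dict, List, Tuple

MIN_GROUP_SIZE = 50  # assumed threshold: suppress groups smaller than this


def aggregate_visits(visits: List[Tuple[str, str]]) -> Dict[Tuple[str, str], int]:
    """Count visits per (district, day), dropping any cell small enough
    that individuals could be re-identified."""
    counts = Counter(visits)
    return {cell: n for cell, n in counts.items() if n >= MIN_GROUP_SIZE}


# Example: raw records are (district, date) pairs with no personal identifiers.
raw = [("Orchard", "2020-04-01")] * 120 + [("Bukit Timah", "2020-04-01")] * 7
print(aggregate_visits(raw))  # the seven-person cell is suppressed
```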

Third, under which circumstances should data sharing be allowed within and outside the government? As the current pandemic response illustrates, ministries of technology, health, transportation, and others may need to share information but should justify it based on specific use cases. If corporate third parties are supporting public efforts, the data should be stripped of personal identifiers. Furthermore, data should never be shared with third parties for purposes such as advertising or other services.

How much public disclosure is required?

Fourth, when can citizens access government-stored data and request deletion? From the outset, citizens should be able to view what data the government has stored about them. After a specified period, individuals should have the right to have their data deleted.

Fifth, how much public disclosure is required by the government? Governments should be obligated to frequently communicate their data-related strategies in order to maintain public trust. This is particularly important given the rise of AI for data mining, about which there is little public understanding.

These frameworks matter not only in the current pandemic, but also as we encounter new crises and deploy emerging technologies. (For example, under what circumstances, if any, would you condone a tracking chip under your skin?) All countries are now in a phase of experimentation in using technology to address the public health crisis of the pandemic. Unlike after 9/11, however, there is now an opportunity to urge stronger adherence to standards of data governance that balance effectiveness and privacy.