
Australia’s COVIDSafe app was launched by the Australian Government in April 2020. The app uses Bluetooth technology to record “contact events” or “digital handshakes” between app users, which are stored on users’ phones for 21 days. Contact events include the encrypted ID of the other contact user, the Bluetooth signal strength during the event, and its duration and time (but not location data). If a user tests positive, this information is uploaded to the National COVIDSafe Data Store (a cloud-based data repository supported by Amazon Web Services and administered by the Digital Transformation Agency), where it can be decrypted for use by state contact tracers.
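To make that data flow concrete, here is a minimal sketch of what a contact event record and the 21-day rolling retention window might look like. This is an illustration only: the field names, types, and rotation details are my own assumptions based on the description above, not the actual COVIDSafe implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Illustrative sketch only: field names and structure are assumptions based on the
# public description of the app, not the actual COVIDSafe source code.
@dataclass
class ContactEvent:
    encrypted_user_id: bytes   # encrypted ID broadcast by the other user's device
    rssi: int                  # Bluetooth signal strength (a rough proxy for distance)
    started_at: datetime       # when the "digital handshake" began
    duration_seconds: int      # how long the devices stayed in range
    # note: no location data is recorded

RETENTION = timedelta(days=21)

def prune_expired(events: list[ContactEvent], now: datetime) -> list[ContactEvent]:
    """Drop contact events older than the 21-day rolling window kept on the phone."""
    return [e for e in events if now - e.started_at <= RETENTION]

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    events = [
        ContactEvent(b"\x01" * 16, rssi=-60, started_at=now - timedelta(days=5), duration_seconds=900),
        ContactEvent(b"\x02" * 16, rssi=-80, started_at=now - timedelta(days=30), duration_seconds=120),
    ]
    print(len(prune_expired(events, now)))  # -> 1 (the 30-day-old event is dropped)
```

Only if a user tests positive (and consents) would the surviving records be uploaded to the National COVIDSafe Data Store for decryption and use by state contact tracers.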
The app has been hampered by concerns about its security, privacy, and effectiveness. Amendments to the federal Privacy Act 1988 created a legislative framework for protecting the privacy of app data and preventing ‘function creep’, i.e., the risk of data being used for purposes other than contact tracing, such as law enforcement. This framework follows the same format as other privacy laws in Australia (such as the legislation that applies to the My Health Record system), setting out a series of permitted collections, uses, and disclosures of app data related to contact tracing and to maintaining the data store and the app. All other collections, uses, or disclosures are prohibited, as is uploading app data from a user’s device to the data store without their consent, retaining app data outside Australia or disclosing it to someone outside Australia (unless for contact tracing purposes), and decrypting app data on a user’s device.
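The "permitted purposes only" structure of the legislation maps naturally onto an allow-list. The sketch below shows that shape in code; the purpose labels are my own illustrative names, not terms drawn from the Act.

```python
# Minimal sketch of an allow-list: anything not explicitly permitted is refused.
# The purpose labels are illustrative only, not terms used in the legislation.
PERMITTED_PURPOSES = {
    "contact_tracing_by_state_health_authority",
    "maintaining_the_data_store",
    "maintaining_the_app",
}

def authorise(purpose: str) -> None:
    """Raise unless the requested use of app data is on the permitted list."""
    if purpose not in PERMITTED_PURPOSES:
        raise PermissionError(f"Use of app data for '{purpose}' is prohibited")

authorise("contact_tracing_by_state_health_authority")  # allowed
# authorise("law_enforcement")  # would raise PermissionError
```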
Crucially, the legislation protects voluntary use, for example, by making it an offence to require someone to download or use the app, or to refuse to provide them with goods or services because they’re not using it. The legislation also creates a mechanism for dismantling the system when it is no longer needed, and for deleting the information contained in the data store.
The basic legislative privacy protections on the app are sound, although commentators have identified some ways in which they could be strengthened, for example, by providing for the periodic removal of contact event data from the data store.
Where the system really falls down is in the design and operation of the app itself. This invokes the concept of privacy by design, i.e., building privacy protections into the physical design, architecture, and computer code of the device or system concerned. Privacy in the digital realm can be protected through multiple channels, including contractual mechanisms, legislation, and design-based solutions, and the physical design of the system or device is at least as important as any legal frameworks that apply, if not more so. This is often referred to as “code”- or “architecture”-based regulation, and it is worth considering whether, or how, the privacy and transparency concerns raised below could also be addressed through legislation.
Privacy advocates and tech experts have extensively canvassed the security and privacy flaws in the app, as well as technical problems that prevent it from operating effectively. This report, by a group of software developers and cybersecurity experts, provides a comprehensive and readable summary. Some of the early bugs included “phone model and name being constantly exposed and unique identifiers being available to track over time… undetectable, permanent long-term tracking of iOS and Android devices and attackers being able to control devices remotely” (p. 7). The authors point out that many of the app’s technical challenges stem from using Bluetooth for a function it wasn’t originally intended for, i.e., continually and indefinitely scanning the environment for other devices and then making connections with them.
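The tracking problem the report describes arises when a device broadcasts the same identifier for a long time. A common design-level mitigation, sketched below, is to derive short-lived ephemeral IDs from a secret key and rotate them so that a passive observer cannot link broadcasts across time. This is a generic illustration of the idea, not the actual COVIDSafe, Herald, or Apple/Google mechanism, and the 15-minute rotation window is an assumption.

```python
import hashlib
import hmac
import os
import time

# Simplified sketch of rotating ephemeral identifiers (not the actual COVIDSafe,
# Herald, or Apple/Google mechanism): a short-lived ID is derived from a secret key
# and the current time window, so an observer who records Bluetooth broadcasts
# cannot link them into a long-term track of the phone.
ROTATION_SECONDS = 15 * 60  # assumed 15-minute rotation window, for illustration

def ephemeral_id(secret_key: bytes, unix_time: float) -> bytes:
    window = int(unix_time // ROTATION_SECONDS)
    return hmac.new(secret_key, window.to_bytes(8, "big"), hashlib.sha256).digest()[:16]

secret = os.urandom(32)  # never broadcast; held by the device (or a trusted server)
id_now = ephemeral_id(secret, time.time())
id_later = ephemeral_id(secret, time.time() + ROTATION_SECONDS)
assert id_now != id_later  # IDs from different windows are unlinkable without the key
```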
They also say that some of the technical issues with the app resulted from a lack of consultation with tech experts (and the wider community) during its development, as well as a lack of testing and verification.
Also concerning has been the DTA’s slow response to concerns raised by the tech community once the app was launched, as well as limited transparency in the scheme’s operation. This includes the DTA’s failure to release the number of active users, and the Government’s reluctance to release the full version of an independent report on the app’s operation, which found that the app imposed significant time costs on contact tracers for little additional benefit. Some of this information was omitted from the shorter version of the report that was originally made publicly available.
The Government has taken steps to address some of the bugs in the app, including through the adoption of the “Herald” protocol in December 2020, although the authors of the report mentioned above say this protocol still has problems and in fact reintroduced some issues that had previously been fixed. They call for the Government to adopt the Exposure Notification Framework developed by Apple and Google, which doesn’t create the same privacy and security challenges as the COVIDSafe app.
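The key difference is architectural: the Exposure Notification Framework is decentralised, so exposure matching happens on each phone rather than in a central data store. The sketch below shows that idea in highly simplified form; it is not the real key-derivation specification, and the interval count and derivation function are placeholders for illustration.

```python
import hashlib
import hmac

# Highly simplified sketch of the decentralised idea behind the Apple/Google
# Exposure Notification Framework (not the real key-derivation spec): phones
# broadcast rolling IDs derived from a daily key; a positive user publishes only
# their daily keys; every other phone re-derives those rolling IDs locally and
# checks them against the IDs it overheard, so matching never leaves the device.

def rolling_ids(daily_key: bytes, intervals: int = 144) -> set[bytes]:
    return {
        hmac.new(daily_key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
        for i in range(intervals)
    }

# On the infected user's phone: the daily key is published via the health authority.
published_daily_key = b"\x11" * 32

# On another phone: IDs it overheard via Bluetooth over recent days.
sample_id = next(iter(rolling_ids(published_daily_key)))
overheard = {sample_id, b"\x00" * 16}

# Local matching: did any published key generate an ID this phone overheard?
exposed = bool(rolling_ids(published_daily_key) & overheard)
print(exposed)  # True -> this phone was near the infected user
```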
There have also been developments in the responsiveness and transparency of the scheme. For example, the DTA has identified a contact point for security concerns, and in April 2020 it made the full source code for the app publicly available in a GitHub repository. But, according to researcher Emma Blomkamp, the early lack of community engagement was a missed opportunity to build public acceptability of the app, or a ‘social licence to operate’ (particularly among Australia’s diverse communities), and to inform the public about the app’s operation and the privacy protections that would apply.
Trust in government is crucial to an effective response to the COVID-19 pandemic. By now, we all know that governments possess highly coercive powers for responding to public health emergencies. But to a significant extent, governments must rely on people voluntarily doing the right thing, including downloading the COVIDSafe app and sharing their personal information with contact tracers. That’s much more likely to happen when people trust the government, and that trust is much more likely when there’s a transparent and accountable system in place, combined with rigorous privacy protections, both “code”-based and law-based.
This is an area where a fast rollout shouldn’t have come at the expense of a responsive, transparent, or accountable one.