CovidSafe legislation must clarify existing privacy measures to boost app's efficacy
30 Apr 2020 4:56 am by Laurel Henning
Three million Australians have downloaded the government’s CovidSafe contact tracing app in three days, as the government followed an Asia-Pacific wave in a global trend to introduce Bluetooth-based apps designed to help stop the spread of Covid-19.
Among Western governments, Australia has been ahead of the curve: plans for apps in France, Germany, Italy and the Netherlands face hurdles due to privacy concerns, despite Covid-19 case numbers being much higher in Europe.
With the Australian government expected to encourage use of the app beyond the three-month shelf life of the rules that currently govern the technology, new legislation tailor-made for the app must more clearly define “de-identified information” and the permitted uses of the data the app collects.
It’s true that the “determination,” which governs the app for now and imposes maximum prison sentences of five years for any attempted data breach, will slow the initial rush to pass legislation to govern the app.
But some important clarifications need to be made, with some experts arguing for tougher penalties under the anticipated CovidSafe regulation.
The app relies on a “Bluetooth handshake” between two phones using the app, with the app detecting and logging when subscribers are within 1.5 meters of one another for more than 15 minutes. This means that instances of contact are captured, but specific location information is not.
That said, Android devices require all apps that request access to Bluetooth to also obtain location permission, meaning that CovidSafe will request access to location permissions when installed on an Android device. But the app does not store or use the location data, the app’s “frequently asked questions” section explains.
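As a rough illustration only, not the app’s actual code, the contact rule described above (within 1.5 meters for more than 15 minutes) can be sketched in Python. The function name and the shape of the input data are assumptions for the example:

```python
from datetime import datetime, timedelta

# Thresholds described for the app: within 1.5 m for at least 15 minutes.
MAX_DISTANCE_M = 1.5
MIN_DURATION = timedelta(minutes=15)

def is_close_contact(sightings):
    """Decide whether repeated Bluetooth sightings of one device
    count as a close contact.

    `sightings` is a time-ordered list of (timestamp, estimated_distance_m)
    tuples. Returns True once the device has stayed within range for at
    least the minimum duration.
    """
    window_start = None
    for ts, dist in sightings:
        if dist <= MAX_DISTANCE_M:
            if window_start is None:
                window_start = ts  # start of an unbroken contact window
            if ts - window_start >= MIN_DURATION:
                return True
        else:
            window_start = None  # contact broken; reset the window
    return False
```

Note that only the fact of contact is recorded here; no location appears anywhere in the logic, which matches the app’s stated design.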
On installing the app, users enter their name, mobile number, age range and postcode. All of this information is then assigned a user ID, which is regenerated on a rolling basis to protect individual identity being linked to Bluetooth signals.
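One common way to implement such rolling IDs, offered here as a minimal sketch rather than CovidSafe’s actual scheme, is to derive a fresh token for each time window from a server-held secret, so that observers of the Bluetooth signal see a different opaque value each window and cannot link it to a person without that secret:

```python
import hmac
import hashlib

def temp_id(server_key: bytes, user_id: str, epoch: int) -> str:
    """Derive a rotating pseudonymous ID for a given time epoch.

    Nearby devices see a different opaque token each epoch; only the
    holder of `server_key` can link tokens back to the registered user.
    All names here are illustrative, not taken from the app.
    """
    msg = f"{user_id}:{epoch}".encode()
    return hmac.new(server_key, msg, hashlib.sha256).hexdigest()[:16]
```

The same user ID yields a different token in every epoch, which is the property that prevents Bluetooth signals from being linked to an individual identity.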
When an app subscriber tests positive for Covid-19, they can decide whether to give health officials access to the contact data the app has stored, in encrypted form, on their phone up to that point.
If an infected user agrees, public health officials provide them with a Personal Identification Number, or PIN; entering the PIN on the device uploads the ID data it holds to central data storage, a service being provided by Amazon Web Services, or AWS. Subscribers can still choose not to do this.
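The double gate described above, explicit consent plus a PIN issued by a health official, can be sketched as follows. This is an illustrative model of the flow, with hypothetical names, not the app’s real upload mechanism:

```python
import secrets

class UploadGate:
    """Sketch of a consent-and-PIN-gated upload of contact logs."""

    def __init__(self):
        self.pin = None

    def issue_pin(self) -> str:
        # A health official generates a one-time PIN for the patient.
        self.pin = f"{secrets.randbelow(10**6):06d}"
        return self.pin

    def upload(self, entered_pin: str, consent: bool, logs: list) -> bool:
        # Data leaves the phone only with BOTH consent and a valid PIN.
        if not consent or self.pin is None or entered_pin != self.pin:
            return False
        # ... transmit `logs` to the central data store (omitted) ...
        return True
```

The point of the sketch is that neither the PIN alone nor consent alone suffices; the upload is voluntary at every step, as the determination requires.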
Under the determination governing the app, information is “de-identified” if it is no longer about an individual who is identified or reasonably identifiable.
In an interview with MLex, Sydney-based lawyer Patrick Fair said the definition is “one possible weakness” of the rules governing the app.
“There is some concern that the determination doesn’t provide guidance about the meaning of reasonably identifiable,” he said.
“For example, if in a list, names have been replaced by numbers, one might say the individuals can be uniquely identified. But because they can’t be located without a key that associates the numbers with the names and contact details, are these individuals reasonably identifiable?”
In some circumstances, then, the information appears to be de-identified; in others it might not be.
“Better clarity around this issue would improve the privacy protection,” Fair said.
A briefing note on the app from law firm Gilbert & Tobin observes that permitted use of “de-identified information for statistical purposes […] may carry some inherent de-identification risk.”
Privacy concerns were a hurdle to getting the Australian app up and running in the first place, so getting this right is vital to the app’s wider uptake and chances of success in aiding a reduction in the spread of Covid-19.
The determination’s wording on permitted uses raises questions about what disclosure is necessary and what limits apply to the data’s use in prosecutions.
Fair told MLex that current measures imply that “if someone tried to get the contact information collected by the app under subpoena for a prosecution, perhaps to demonstrate you were somewhere you shouldn't have been, they would commit an offence.”
New legislation could also more clearly end the collection of data through the app.
“The time and circumstances of ending should be specified,” Fair said, adding that this could be done through provisions requiring all relevant personal information to be deleted.
“Even information which has been downloaded and made available should be de-identified and deleted once the contact tracing period is over. An express requirement like this would protect the information from unlawful use and/ or unauthorized access or loss,” Fair said.
Robyn Chatwood, a partner with law firm Dentons, told MLex in an interview that there is concern over how the rules governing the app interact with other pieces of Australian legislation.
Chatwood gave the example of Australian laws that allow law enforcement agencies to request access to encrypted phone data.
But for now, the dedicated government website for the app says it “cannot be used to enforce quarantine or isolation restrictions, or any other laws.”
It’s not just the privacy rules that have garnered the attention of legal experts. Specific features of the app’s registration form have also sparked interest.
Chatwood said the government’s justification for collecting users’ age range, namely that it will assist the government to prioritize its response to reports of positive cases, “needs to be unpacked.”
As it stands, that prioritization could be read in several ways: older app users might be prioritized because they are more vulnerable to the coronavirus, or younger users might be, because they are more likely to spread it.
Policymakers should consider clarifying this prioritization of cases.
While the information is encrypted on a user’s phone, the area of phone storage used is itself not encrypted.
While it is a criminal offence to attempt to access the data, the way the information is stored on users’ phones makes it possible to extract it from a device with an unlocked or “jailbroken” operating system.
Then there’s the question of centralized storage once the data has been sent to government health authorities.
Chatwood says that “naturally, there is a loss of control over personal information collected through the app once the information is disclosed to government and held in its data store.”
“There are no guarantees of privacy with any technology. My concern is that centralized databases, such as the proposed data store, are attractive assets to cyber-criminals,” Chatwood said.
Still, the app is a significant step forward in cutting out the reliance on human memory when patients diagnosed with Covid-19 try to recall for health authorities a list of people they may have been in contact with.
The app goes beyond friends and family to include someone else on the bus, or in a queue for a take-away coffee.
The relaxation of social-distancing restrictions in Australia may depend on the app’s take-up, and while three million downloads in three days is impressive, seven million more are needed to reach the 40 percent of the population considered the minimum for the app to be effective.