Several rows of seats outside are empty except for two men staring at their smartphones.
About 44 percent of COVID-19 apps on iOS request access to the phone’s camera; 32 percent request access to photos.

When the idea of using smartphones to help fight the COVID-19 pandemic first surfaced last spring, it sparked months of debate: should apps collect location data, which could help with contact tracing but risks revealing sensitive information? Or should they take a narrower approach and measure only Bluetooth-based proximity to other phones? Now, a wide-ranging survey of hundreds of COVID-19-related apps reveals that the answer is all of the above. And that has turned the COVID-19 app ecosystem into a kind of wild, sprawling landscape, full of potential privacy pitfalls.

Late last month, Jonathan Albright, director of the Digital Forensics Initiative at the Tow Center for Digital Journalism, released the results of his analysis of 493 COVID-19-related iOS apps in dozens of countries. His study of those apps, which tackle everything from symptom tracking to telehealth consultations to contact tracing, catalogs the data permissions each one requests. At WIRED’s request, Albright then broke the dataset down further to focus specifically on the 359 apps that handle contact tracing, exposure notification, screening, reporting, workplace monitoring, and COVID-19 information from public health authorities around the world.

The results show that only 47 of that subset of 359 apps use Google and Apple’s more privacy-friendly exposure notification system, which restricts apps to Bluetooth data collection alone. The remaining six out of every seven COVID-19-focused iOS apps worldwide are free to request whatever privacy permissions they want, with 59 percent asking for the user’s location when in use and 43 percent asking to track the user’s location at all times. Albright found that 44 percent of COVID-19 apps on iOS asked for access to the phone’s camera, 22 percent asked for access to the microphone, 32 percent asked for access to photos, and 11 percent asked for access to contacts.
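For readers unfamiliar with iOS permission tiers, the two location figures above correspond to two distinct requests an app can make. Below is a minimal Swift sketch, not taken from any of the surveyed apps, showing how an app asks for each tier; the class name and print statements are illustrative only.

```swift
import CoreLocation

// Minimal sketch of the two location-permission tiers the survey counts.
// Requires matching Info.plist usage strings:
//   NSLocationWhenInUseUsageDescription          (the "when in use" case)
//   NSLocationAlwaysAndWhenInUseUsageDescription (the "at all times" case)
final class LocationPermissionRequester: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // Foreground-only tracking: iOS shows an "Allow While Using App" prompt.
    func requestWhenInUse() {
        manager.requestWhenInUseAuthorization()
    }

    // Continuous tracking: iOS may later escalate with an "Always Allow" prompt.
    func requestAlways() {
        manager.requestAlwaysAuthorization()
    }

    // iOS 14+ delegate callback reporting the user's decision.
    func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
        print("Authorization status:", manager.authorizationStatus.rawValue)
    }
}
```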

“It’s hard to justify why many of these apps need your constant location, your microphone, and your photo library,” says Albright. He warns that even COVID-19 tracking apps built by universities or government agencies — often at the local level — risk putting private data, sometimes linked to health information, out of the users’ control. “We have a lot of different, smaller public entities that are sort of developing their own apps, sometimes with third parties. And we don’t know where the data is going.”

The relatively low number of apps using Google and Apple’s exposure notification API compared with the total number of COVID-19 apps shouldn’t be seen as a failure of the companies’ system, Albright emphasizes. While some public health authorities have argued that collecting location data is necessary for contact tracing, Apple and Google have made clear that their protocol is for the specific purpose of “exposure notification”: warning users directly about their exposure to other users who have tested positive for COVID-19. That doesn’t include the contact tracing, symptom monitoring, telemedicine, and COVID-19 information and news that other apps provide. The two tech companies have also restricted access to their system to public health agencies, which has limited its adoption by design.
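As a rough illustration of how narrow that system is, here is a hedged Swift sketch of the ExposureNotification framework’s entry point. An app cannot actually ship this without Apple granting the restricted exposure-notification entitlement, which is the mechanism that keeps access limited to public health agencies.

```swift
import ExposureNotification

// Sketch of the restricted Apple/Google Exposure Notification entry point.
// Shipping this requires the com.apple.developer.exposure-notification
// entitlement, which Apple grants only to public health authorities.
let manager = ENManager()
manager.activate { error in
    if let error = error {
        print("Activation failed:", error)
        return
    }
    // Asks the user to turn on Bluetooth-based exposure logging. The framework
    // exchanges only rotating random identifiers; it never touches location.
    manager.setExposureNotificationEnabled(true) { error in
        if let error = error {
            print("Could not enable exposure logging:", error)
        } else {
            print("Exposure notification enabled:", manager.exposureNotificationEnabled)
        }
    }
}
```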

“Almost as bad as you’d expect”

But Albright’s data nevertheless shows that many US states, local governments, workplaces, and universities have chosen to build their own systems for COVID-19 tracking, screening, reporting, exposure warnings, and quarantine monitoring, perhaps in part as a result of those focus and data restrictions. Of the 18 exposure notification apps Albright counted in the United States, 11 use the Bluetooth-only system from Google and Apple. Two of the others are based on a system called PathCheck Safeplaces, which collects GPS information but promises to anonymize users’ location data. Others, such as Citizen Safepass and the CombatCOVID app used in Florida’s Miami-Dade and Palm Beach counties, ask for access to users’ location and Bluetooth proximity information without using Google and Apple’s privacy-restricted system. (The two Florida apps asked for permission to track the user’s location in the app itself, oddly, rather than in an iOS prompt.)

But those 18 exposure notification apps were only part of a larger category of 45 apps Albright classified as “screening and reporting” apps, whose functions range from contact tracing to symptom logging to risk assessment. Of those apps, 24 asked for location while the app was in use, and 20 asked for location at all times. Another 19 requested access to the phone’s camera, 10 to the microphone, and nine to the phone’s photo library. One symptom-logging tool called CovidNavigator inexplicably asked for users’ Apple Music data. Albright also examined a further 38 “workplace monitoring” apps designed to keep COVID-19-positive employees quarantined from coworkers. Half of them asked for location when in use, and 13 asked for location at all times. Only one used the Google and Apple API.

“In terms of permissions and built-in tracking, some of these apps seem to be almost as bad as what you’d expect from a Middle Eastern country,” Albright says.

493 apps

Albright compiled his survey of 493 COVID-19-related apps using data from the app analytics companies 42matters, AppFigures, and AppAnnie, and by running the apps himself while using a proxied connection to monitor their network communications. In some cases, he sought public information from app developers about functionality. (He says he limited his study to iOS rather than Android because previous studies that focused solely on Android raised similar privacy concerns, though they examined far fewer apps.) Overall, he says the results of his research don’t point to any necessarily nefarious activity, but rather to a sprawling COVID-19 app marketplace where private data flows in unexpected and less-than-transparent directions. In many cases, users have little choice about whether to use the COVID-19 screening app implemented by their college or workplace, and no alternative to whatever app their state’s health authorities ask them to use.

When WIRED reached out to Apple for comment, the company replied in a statement that it carefully vets all iOS apps associated with COVID-19, including those that don’t use the exposure notification API, to ensure that they are developed by reputable organizations, such as government agencies, health NGOs, companies certified in health matters, or medical and educational institutions, and that they aren’t misleading in their requests for data. Apple also notes that in iOS 14, users are alerted with an indicator dot at the top of the screen when an app is using their microphone or camera, and can choose to share approximate rather than precise locations with apps.
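The approximate-location option Apple describes is exposed to developers through CoreLocation. The sketch below shows how an app on iOS 14 can detect that a user granted only coarse location and ask for a one-time upgrade; the purpose key is hypothetical and would need a matching Info.plist entry.

```swift
import CoreLocation

// Sketch of the iOS 14 approximate-location behavior Apple describes.
// "ExposureCheck" is a hypothetical purpose key; a real app would define it in
// Info.plist under NSLocationTemporaryUsageDescriptionDictionary.
func checkLocationAccuracy(_ manager: CLLocationManager) {
    switch manager.accuracyAuthorization {
    case .fullAccuracy:
        print("User shared a precise location.")
    case .reducedAccuracy:
        print("User shared only an approximate location.")
        // Optionally ask for a one-time upgrade to full accuracy.
        manager.requestTemporaryFullAccuracyAuthorization(
            withPurposeKey: "ExposureCheck"
        )
    @unknown default:
        break
    }
}
```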

But Albright notes that some of the COVID-19 apps he analyzed went beyond direct permission requests and location checks to include advertising analytics as well. While he didn’t find ad-focused analytics tools built into exposure notification or contact tracing apps, he found that, of the apps he classifies as “information and updates,” three used Google’s advertising network and two used the Facebook Audience Network, and many others integrated software development kits for analytics tools including Branch, Adobe Auditude, and Airship. Albright cautions that any of these tracking tools could potentially reveal users’ personal information to third-party advertisers, including potentially even users’ COVID-19 status. (Apple noted in its statement that, starting this year, developers must provide information about both their own privacy practices and those of any third parties whose code they integrate into their apps in order to be accepted into the App Store.)

“Collect data and then monetize it”

Given the rush to create COVID-19-related apps, it’s not surprising that many are aggressively collecting, and in some cases monetizing, personal data, says Ashkan Soltani, a privacy researcher and former chief technologist at the Federal Trade Commission. “The name of the game in the app space is to collect data and then monetize it,” says Soltani. “And there’s essentially an opportunity in the market because there’s so much demand for these kinds of tools. People have COVID-19 on the brain, and so developers are going to fill that niche.”

Soltani adds that by allowing only official public health agencies to build apps with access to their exposure notification API, Google and Apple created a system that pushed other developers to build less restricted, less privacy-preserving COVID-19 apps. “I can’t build an exposure notification app that uses Google and Apple’s system without consulting public health authorities,” Soltani says. “But I can build my own arbitrary app without any oversight other than App Store approval.”

Concerns about data misuse extend to official channels, too. In recent weeks, the British government has said it will give police access to contact tracing information and, in some cases, issue fines to people who do not self-isolate. And following a public backlash, the Israeli government has reversed a plan to share contact tracing information with law enforcement for use in criminal investigations.

Not necessarily evil

Apps that request and collect location data in a centralized manner don’t necessarily have shady intentions. In many cases, knowing at least elements of an infected person’s location history is essential for effective contact tracing, said Mike Reid, an infectious disease specialist at UCSF who also leads San Francisco’s contact tracing efforts. In contrast, Google and Apple’s system prioritizes user privacy but does not share data with health authorities. “You leave the responsibility completely up to the individual, which makes sense from a privacy standpoint,” Reid says. “But from a public health standpoint, we would be completely dependent on the person calling us, and it’s unlikely that people will.”

Reid also notes that with Bluetooth data alone, you’d have little idea of when or where contact with an infected person occurred: whether the infected person was inside or outside, wearing a mask at the time, or behind a plexiglass barrier, all factors whose importance has been better understood since Google and Apple first announced their exposure notification protocol.

All of this helps explain why so many developers have turned to location data, despite all the privacy risks that come with location tracking. And that leaves users to weigh the privacy implications and potential health benefits of an app’s location data request themselves, or to take the easier way out of that minefield and just say no.

This story originally appeared on wired.com.
