App Faults Users Who Don’t Understand 7-Step Privacy Protocol After Data Leak

The company released a defensive statement after users discovered the app collected data on military personnel and revealed the locations of secret army bases.

How will Strava’s data leak affect digital security? (Photo: Pixabay)

Strava just provided a textbook example of how not to respond to a public relations disaster.


The fitness tracking app received tons of bad press over the weekend after eagle-eyed Australian blogger Nathan Ruser discovered a major security problem with its global heatmap. The data visualization aggregates activity from all of the app’s users worldwide between January 2015 and September 2017. It shows three trillion GPS data points, covering one billion activities and 17 billion miles run, jogged or swum.

It’s a fact of life that most apps collect personal information about their users—and indeed, most Strava users likely opted into the heatmap without even thinking about it.

But where the app ran into trouble was with the data it collected on military personnel. Anyone who used Strava to track their fitness while working on an army base gave away their location—and as analysts soon found out, the data is very easy to de-anonymize.

Indeed, the Guardian found that in many Middle Eastern countries, the most active Strava users were foreign military personnel. For example, the only fitness activity in Afghanistan’s Helmand province came from army bases. Any user who zoomed in on the map could see a base’s internal layout mapped out by jogging routes and find full names and running data for 50 service members stationed there. Google Maps and Apple Maps don’t have this information, but Strava does.

Even outside direct conflict zones, Strava still holds sensitive information.

The Nevada Air Force base known as Area 51 is mostly deserted, but at least one person is still working there—Strava data shows a cyclist riding his bike through the base and surrounding area.

The Department of Defense said in a statement that it is “reviewing the situation to determine if any additional training or guidance is required.” At least one military branch, the Marines, currently allows devices with Bluetooth connectivity and GPS tracking functions to be used on base.

Given this stream of bad press, you’d think Strava would immediately respond with a concrete solution to ensure this never happens again. But that didn’t happen.

“Our global heatmap represents an aggregated and anonymized view of over a billion activities uploaded to our platform,” Strava said in a statement. “It excludes activities that have been marked as private and user-defined privacy zones. We are committed to helping people better understand our settings to give them control over what they share. For more information about Strava privacy, please visit ‘How to manage your privacy on Strava.’”

Strava basically put the onus on users with a link to its six-month-old privacy guide. It’s a fancy way of saying “This isn’t our fault.”

But it very well could be.

Quartz reporter Rosie Spinks discovered in August that strangers could “like” her workouts even though she’d enabled “Enhanced Privacy” in the app. She found that users with stricter privacy settings still appeared on the app’s “leaderboards” (which are open to anyone) unless they enabled “Hide From Leaderboards,” a separate privacy function.

However, if a private Strava user runs with other app users whose accounts are public, the private user’s data can still be viewed—unless they toggle “Group Activity Enhanced Privacy,” another setting.

Users also have to set up “Hidden Locations,” such as a home or office, to prevent people from seeing where their workouts start or end. This feature is only available in Strava’s browser version, not the app.

And there’s more—Strava users can also get data on anyone they cross paths with on a given route, unless those users have enabled “Hide From Flybys.”

If users want to join a “Challenge” (such as “run a half marathon”), their data and the data of everyone else in that Challenge group are also public.

Finally, there’s a “Private by Default” option, which strips the app of all social aspects.

That’s right, there isn’t just one level of privacy settings—there are seven. If John and Jane Runner are understandably confused by this, it’s definitely more Strava’s fault than theirs.
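To see how tangled those toggles get, here is a rough, purely illustrative sketch in Python. The setting names and the visibility logic are assumptions modeled on the behavior described above, not Strava’s actual code or API:

```python
# Purely illustrative sketch; the setting names and logic below are assumptions
# based on the behavior described in this article, not Strava's actual code.
from dataclasses import dataclass


@dataclass
class PrivacySettings:
    enhanced_privacy: bool = False        # limits who can see a user's profile
    hide_from_leaderboards: bool = False  # keeps the user off public leaderboards
    group_activity_privacy: bool = False  # "Group Activity Enhanced Privacy"
    hidden_locations: bool = False        # masks where workouts start and end
    hide_from_flybys: bool = False        # hides the user from "Flyby" route matching
    private_by_default: bool = False      # strips out all social features


def workout_is_exposed(s: PrivacySettings,
                       joined_challenge: bool,
                       ran_with_public_users: bool) -> bool:
    """Return True if some part of a workout can still be seen by strangers."""
    if s.private_by_default:
        return False  # the only single switch that closes everything off
    return (
        not s.hide_from_leaderboards      # leaderboards are open to anyone
        or not s.hidden_locations         # start and end points stay visible
        or not s.hide_from_flybys         # visible to anyone crossed on a route
        or (ran_with_public_users and not s.group_activity_privacy)
        or joined_challenge               # Challenge data is public
    )


# "Enhanced Privacy" alone still leaves plenty exposed:
print(workout_is_exposed(PrivacySettings(enhanced_privacy=True),
                         joined_challenge=False,
                         ran_with_public_users=True))  # True
```

Under this rough model, a user has to flip five or six separate switches, or give up the social features entirely, before a workout stops leaking somewhere.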

Strava needs to take a second look at its privacy protocols and eliminate redundancies, so any user (military or civilian) knows just what privacy rights they’re giving up. But if the company continues its initial PR strategy of pointing fingers, the Strava saga will only get worse.
