Anonymous Data Aggregation and Unintended Consequences
Data from fitness websites and trackers reveals U.S. military bases around the globe.
The military goes to great lengths to keep the locations of its secret bases just that—secret. But troops wanting to get a little exercise while deployed may have inadvertently given away the location of such bases, supply routes and more.
The fitness tracking site Strava recently published a global heat map showing areas where its users have exercised over the last few years, which might be neat to look at when checking out cities like New York, Boston or Los Angeles. But some intrepid browsers of the data found small hot spots of athletic activity in far-flung locales of Afghanistan, Iraq and Syria. It turns out that troops tracking their workouts through Fitbits and other devices have traced rough outlines of military bases and the routes in and out of those bases.
This operational security failure isn’t the result of a single person’s data being shown. The data is pulled anonymously from 27 million Strava users, but the aggregation reveals trends the military would prefer people didn’t see.
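To see why aggregation defeats anonymization here, consider a minimal sketch (all coordinates and counts below are hypothetical, not Strava's actual data or algorithm): GPS fixes with no user IDs attached are binned into grid cells, and cells with repeated activity light up. A remote perimeter jogged by many people becomes a bright spot even though no individual is ever identified.

```python
from collections import Counter

def heatmap(points, cell=0.01):
    """Aggregate anonymous GPS fixes into grid-cell counts.

    points: iterable of (lat, lon) tuples with no user IDs attached.
    cell: grid resolution in degrees (~1 km at mid latitudes).
    """
    counts = Counter()
    for lat, lon in points:
        # Snap each fix to the nearest grid cell; only the count survives.
        key = (round(lat / cell) * cell, round(lon / cell) * cell)
        counts[key] += 1
    return counts

# Hypothetical data: 300 runs along the same remote perimeter, plus a
# scattering of unrelated background activity across the region.
remote_runs = [(34.5101 + i * 0.0001, 69.2003) for i in range(300)]
background = [(34.7 + i * 0.05, 69.5 + i * 0.05) for i in range(10)]

grid = heatmap(remote_runs + background)
hottest_cell, hits = grid.most_common(1)[0]
```

Even though every point is anonymous, the hottest cell pins the perimeter's location; that is the pattern the heat map exposed.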
This event harkens back to when Domino's said it could predict crises and potential military actions based on upticks in pizza deliveries to key government buildings in Washington.
A spokesperson for the Secretary of Defense says the department is aware of the breach and that, "We are taking a look Department-wide at our policies."
Lessons to be Learned
Obviously, it falls on the individuals working in sensitive areas to know what data they may be broadcasting out to the world, anonymous or not. Data privacy regulations such as the EU’s GDPR won’t necessarily help, as they allow for analysis of anonymized data.
From the vendor side, there are a number of things to consider: First, there’s the fallout of unintended consequences when releasing massive amounts of data to the public. Chances are Strava didn’t realize what its maps detailed.
There’s also a need for better end-user controls and education. License agreements spell out how data will be used, but let’s be honest, it’s likely no one reads them. It should be made simple for users to turn certain data off when needed.
This event also reinforces the need for companies to protect customer data. As this C4ISRnet article asks: What other data does a company like Strava have that might be interesting to prying eyes? What if a malicious party could gain access and pair user information with routes? It sounds like a real-life episode of NCIS.