The Importance of Digital Privacy (Part 2)
In part one of this series, I outlined why governmental institutions are falling far behind the tech industry when it comes to regulation, and why it is crucial for individuals to be knowledgeable and proactive about protecting their information. I also went into more depth on Facebook and Google and gave a few examples of how they’ve violated the trust of their users. This week, I’ll talk a bit about Instagram (owned by Facebook) and Twitter, with a focus on geolocation.
Most, if not all, Instagram users are aware of geotagging: the option to publicly attach a location to the photos they post. It is presented to the user as an option when they are posting a new photo.
Despite Instagram’s best efforts, you don’t have to enable location services in order to geotag your photos. Decline both of the popups in the images below and geotagging will work just the same, without Instagram tracking your location by default.
What most users don’t know is that if they have given Instagram access to their location data and make a post without manually adding a geotag, Instagram still stores that location data. What they do with it is unclear, but if Facebook’s track record is anything to go by, one can be fairly sure that the privacy of the user base is near the bottom of their concerns.
Personal Data Maintenance
One useful tool that Instagram does provide is the ability to download “all” the data that they have stored on you. (The reason I put “all” in quotes is that it is only the information they are comfortable letting you know they have stored.) If you want to see this data, navigate to Instagram on a computer, click the settings icon on your profile, and click the “Privacy and Security” option.
From there, scroll down to the bottom and hit “Request Download”.
It can take Instagram up to 48 hours to email you a zip file, depending on how much information they have stored on you (it took about five minutes for me).
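If you’re curious whether that export actually contains location traces, you can scan it yourself. Below is a minimal Python sketch that walks the unzipped export and prints any JSON fields whose names look location-related. It is a rough heuristic rather than Instagram’s documented schema: the key names it searches for (location, latitude, longitude, and so on) are assumptions, and the exact files in the export change over time.

```python
import json
import sys
from pathlib import Path

# Assumed key names that suggest stored location data; not an official schema.
LOCATION_KEYS = {"location", "latitude", "longitude", "lat", "lng"}

def find_location_fields(node, path=""):
    """Recursively yield (path, value) pairs for keys that look location-related."""
    if isinstance(node, dict):
        for key, value in node.items():
            child_path = f"{path}.{key}" if path else key
            if key.lower() in LOCATION_KEYS and not isinstance(value, (dict, list)):
                yield child_path, value
            yield from find_location_fields(value, child_path)
    elif isinstance(node, list):
        for i, item in enumerate(node):
            yield from find_location_fields(item, f"{path}[{i}]")

def scan_export(export_dir):
    """Scan every JSON file in the unzipped export for location-related fields."""
    for json_file in Path(export_dir).rglob("*.json"):
        try:
            data = json.loads(json_file.read_text(encoding="utf-8"))
        except (json.JSONDecodeError, UnicodeDecodeError):
            continue  # skip files that aren't valid JSON
        for field_path, value in find_location_fields(data):
            print(f"{json_file.name}: {field_path} = {value}")

if __name__ == "__main__":
    # e.g. python scan_export.py ./instagram_export
    scan_export(sys.argv[1])
```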
A few months ago, Twitter revealed that they had mistakenly been collecting and sharing the location data of an undisclosed number of iOS users. If a user had multiple accounts on the platform and only one of them had location services enabled, Twitter was tracking and storing location data for every account that was logged in. Twitter also stated that location data was accidentally being sent to a third party during a process they call “real-time bidding”.
For more info on what real-time bidding is, you can view MoPub’s website on the subject. MoPub was acquired by Twitter in 2013 and runs the real-time bidding exchange that Twitter is referring to. Essentially, it is a marketplace where companies bid money on user impressions, and those impressions are matched to the companies based on the data that Twitter has stored on its users.
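To make those mechanics a bit more concrete, here is a toy Python sketch of how a single impression gets auctioned off. This is not MoPub’s actual protocol or pricing; the advertiser names, the user-profile fields, and the second-price rule are illustrative assumptions, meant only to show how stored user data (interests, approximate location) feeds directly into what companies bid.

```python
from dataclasses import dataclass

@dataclass
class Impression:
    """One ad slot about to be shown to a user, plus what the platform knows about them."""
    user_id: str
    interests: list          # derived from stored user data (illustrative)
    approx_location: str     # the kind of data Twitter admitted to leaking

def toy_bid(advertiser_targets, impression):
    """Return a bid in dollars: higher when the user's profile matches the advertiser's targets."""
    overlap = len(set(advertiser_targets) & set(impression.interests))
    return 0.10 * overlap  # hypothetical pricing: 10 cents per matching interest

def run_auction(advertisers, impression):
    """Second-price auction: the highest bidder wins but pays the runner-up's bid."""
    bids = sorted(
        ((toy_bid(targets, impression), name) for name, targets in advertisers.items()),
        reverse=True,
    )
    (top_bid, winner), (second_bid, _) = bids[0], bids[1]
    return winner, second_bid

# Hypothetical advertisers and a hypothetical user profile.
advertisers = {
    "ShoeCo": ["running", "fitness"],
    "TravelCo": ["travel", "photography"],
}
impression = Impression("user123", ["running", "travel", "photography"], "Boston, MA")

winner, price = run_auction(advertisers, impression)
print(f"{winner} wins the impression for ${price:.2f}")
```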
Opt-Out vs Opt-In
I’d like to highlight a problematic sentence in the statement that Twitter released:
“We invite you to check your privacy settings to make sure you’re only sharing the data you want to with us.”
This issue is bigger than this one instance, and it shows what’s wrong with making users opt out of data sharing rather than opt in. Requiring users to opt out puts the onus of protecting their privacy on them instead of on the companies. The responsibility of making sure users are only sharing exactly what they intend to share with Twitter should fall on Twitter itself, not the user.
According to a study by researchers from Cornell and Stanford on organ donation policies in various countries:
“In these so-called opt-out countries, more than 90% of people donate their organs…In these opt-in countries, fewer than 15% of people donate their organs at death… People tend to conform to the status quo.”
This study perfectly demonstrates the difference between the two approaches: most people simply keep whatever default the platform gives them. If a company cares most about harvesting data from its users, it will use the opt-out model. If a company values the customer first and wants what’s best for them, it will implement an opt-in model.
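To put the same idea in software terms, here is a hypothetical settings sketch in Python contrasting the two defaults. The class and field names are made up for illustration, not taken from any real platform; the point is simply that whoever chooses the default value effectively chooses for the majority of users.

```python
from dataclasses import dataclass

@dataclass
class OptOutSettings:
    # Opt-out model: collection is on unless the user finds the switch and disables it.
    share_location: bool = True

@dataclass
class OptInSettings:
    # Opt-in model: nothing is collected until the user explicitly enables it.
    share_location: bool = False

# Most users never touch the defaults, so the default value decides what is
# collected for the majority who "conform to the status quo".
print(OptOutSettings())  # OptOutSettings(share_location=True)
print(OptInSettings())   # OptInSettings(share_location=False)
```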
Conclusion
As I said in part one of this series, I believe it is crucial for more people to be conscious of the depth and breadth of the privacy violations that are happening all too often. Only with that awareness is it possible to stay protected in an increasingly volatile and untamable digital world. It’s impossible to know everything there is to know about this, but the more these topics are discussed, the faster policies will begin to change.