Privacy and Security: What’s the Difference?

Article Summary:

  • Security controls determine who can access information, while privacy controls govern when and what they can access.

  • For example, doctors at a nationwide primary care chain most likely have security access to every patient’s account information across the country, but should only have privacy access when there is a “business need,” such as an X-ray or lab test.

  • On May 25th, 2018, the EU’s new privacy regulation (GDPR) goes into effect, seeking to strengthen and unify data protection for individuals by prohibiting companies from forcing consumers to give up their data as a condition of using their services.

 

As data collection has increased, so has controversy. Most of this data has been given willingly by us, the users, through our computers, smartphones, and more recently, smart home devices, cars, and even refrigerators. International regulators have now turned their gaze to some of the massive organizations collecting our data.

Big questions are being examined and litigated, such as “what constitutes consent?” Along the way, simple words like “security” and “privacy” get thrown around interchangeably, but a deeper look raises a fundamental, yet deceptively complex, question: what’s the difference between privacy and security?

Data Déjà Vu

There is without a doubt a close link between privacy and security, and the two words are frequently used synonymously. While the two often share a parallel goal, from a business perspective privacy and security are very different.

When thinking about the collection of sensitive information and personal data, data security can be defined as the practice of safeguarding personally identifiable information against unwanted or unauthorized access. An organization’s central security challenge is to keep data available to the business while maintaining confidentiality, integrity, and availability. Data privacy, by contrast, concerns how that same data is used and whether it is handled appropriately. Ideally, organizations should collect only the data they need, and only for specific purposes.

The relationship between the two is often explained with the adage: “you can’t have privacy without security, but you can have security without privacy.” This is best illustrated by considering the two together: security controls determine who can access the information, while privacy controls govern when and what they can access.

An everyday example from healthcare illustrates both the difference and the overlap between security and privacy. Doctors and nurses at a nationwide urgent care or primary care chain will most likely have been granted security access to every patient’s account information across the country. This lets any patient walk into any center, wherever it is located, with the same seamless ease as his or her local center. However, the organization’s privacy controls restrict a nurse’s or doctor’s access to cases where a “business need” exists; in this example, a patient arriving at a care center for a lab test or X-ray. Those same privacy controls prevent a doctor or nurse from accessing the account of a family member, or of a favorite celebrity who has been a patient in the past, despite being provisioned security access.
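The two layers of the healthcare example can be sketched in code. This is a minimal, hypothetical sketch; the `User`, `RECORDS`, and `can_access` names are invented for illustration and do not reflect any real system’s access model:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class User:
    name: str
    role: str  # e.g. "doctor", "nurse", "visitor"

# Hypothetical patient records, keyed by patient ID.
RECORDS = {"patient-123": {"labs": "...", "xrays": "..."}}

# Purposes that count as a legitimate "business need" for access.
VALID_PURPOSES = {"lab_test", "x_ray", "office_visit"}

def can_access(user: User, patient_id: str, purpose: Optional[str]) -> bool:
    # Security control: only provisioned clinical roles may access records at all.
    if user.role not in {"doctor", "nurse"}:
        return False
    # Privacy control: even provisioned users need a documented business need.
    if purpose not in VALID_PURPOSES:
        return False
    return patient_id in RECORDS

# A doctor with a lab order may view the record...
assert can_access(User("Dr. A", "doctor"), "patient-123", "lab_test")
# ...but the same doctor browsing a chart out of curiosity may not.
assert not can_access(User("Dr. A", "doctor"), "patient-123", None)
```

The key design point is that passing the security check (role provisioning) is necessary but not sufficient: the privacy check (a valid purpose) gates each individual access.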

Now that we understand the distinction, one may ask, “who is responsible for protecting personal data?” In a perfect world, a company would collect only the information it requires, keep it safe, and destroy it once it is no longer needed. But no security policy will prevent a company, or a trusted employee, who is willing to sell or misuse customer data from doing so. This means that even with recent data privacy issues in the political landscape, which will most likely increase security legislation over the next few years, it is really up to individuals to protect themselves by reading company privacy policies and not giving companies more data than is absolutely necessary.

To increase vigilance, we users must truly understand our role in data collection. To paint a better picture of the controversy around data privacy and security, I’ve chronicled some of the major events in our recent history.

Recent History of Data Collection Controversies

It seems like everything we use in our daily routines today collects some amount of personal data. Although most of this data may seem harmless, and most of it is given willingly, the same data used to program your smart home can also be collected and tracked to paint an intimate picture of an individual’s routines: how often you are home or away, what you buy, your health habits, and of course where you live and with whom.

As data collection has increased, so have the scandals, leaks, and breaches. The world was introduced firsthand to one of the largest data collection programs, on a scale previously thought unimaginable, in 2013, when Edward Snowden blew the whistle on the National Security Agency’s massive surveillance program. By collecting phone records, emails, and texts from hundreds of millions of people, both domestic and foreign, the NSA demonstrated how widespread adoption of technology could be leveraged for data collection.

While companies and individuals began changing their practices to protect their data, notably through increased encryption, the issue of data security and privacy has been pushed further to the forefront. In one of the more recent events, and perhaps one of the most memorable to date given the recent presidential election, Facebook has been accused of lax security around the privacy of its users’ data. Although Facebook officially stated it was not a data breach, weak security and privacy measures around user data led to alleged illegal data mining of the personal information of as many as 87 million users. This data was then acquired by Cambridge Analytica, a political consultancy hired by the Trump campaign during the 2016 presidential election. In April of this year, in direct response to the allegations, Facebook shared an updated data policy and added that its list of security changes is only the beginning. Earlier this month, Facebook announced a new privacy feature that lets users see which companies and websites are tracking them across the Internet and delete that information from their accounts. Following the fallout from its use of Facebook data in campaign tactics, Cambridge Analytica recently announced it will be shutting down.

All of this comes on the heels of the European Union’s (EU) new privacy law, the General Data Protection Regulation (GDPR), which goes into effect May 25th, 2018. The regulation seeks to strengthen and unify data protection for individuals within the EU by prohibiting companies from forcing consumers to give up their data as a condition of using their services. Companies, many of whose business models rely on behavioral advertising, argue that much if not all of the information they collect is needed to fulfill a contract with users. Facebook, for example, says the data it collects is necessary to provide users with a “personalized experience.” However, the debate remains over what companies really need to know (e.g., your recently visited websites) and what constitutes consent. As Facebook works to comply with GDPR, in part by asking users to agree to being shown targeted ads and giving them an ultimatum (accept the new terms or close their accounts), regulators, including Europe’s lead privacy regulator, are scrutinizing companies’ reliance on contracts to justify collecting and using data protected under GDPR. Debates such as these will only continue once GDPR takes effect, as advocacy groups will be able to bring collective complaints, akin to class action lawsuits, before regulators and national courts.

One can only assume it will be a matter of time before the US sees similar regulation.

About the Author

Adam Sheehan is a Senior Consultant at Impact Makers.