Privacy is a huge deal at Symantec. As one of the world’s largest security companies, we’ve made company-wide commitments to being responsible stewards of customer data.
Symantec has a dedicated privacy function, separate from the cyber security team that reports to me. My team nonetheless feels a strong sense of responsibility for driving good privacy outcomes.
Today, on Data Privacy Day, I’ve been reflecting on how the role and approach of security practitioners to privacy needs to evolve to meet the heightened expectations of the community we serve.
The remit of the security function is usually described as protecting the confidentiality, integrity and availability of data. You might instinctively assume that our role in privacy relates most to confidentiality – specifically, to preventing unauthorized disclosure of PII to external parties. At Symantec, we obsess about keeping unauthorized users and malware off our networks and about implementing data protection controls that monitor and control the flow of data off the network.
But that’s not where the conversation about confidentiality and privacy ends.
Customers also reasonably expect security teams to monitor and control who has access to PII data inside the company. They expect it to only be accessed on a ‘need-to-know’ basis. But many organizations have fallen short on these expectations. Consider the following.
- Key staff need to be trusted to access repositories of customer data as part of their work. But should they be trusted to access all customer data at once?
- Health professionals need to be trusted to access the records of patients in their care. But should they be allowed access to records of patients they didn’t directly treat? (Ask George Clooney, Ed Sheeran, Kim Kardashian and Britney Spears).
- A loan officer needs to be trusted with as much information as possible about an applicant to make responsible decisions about their credit worthiness. But should they have access to that client’s medical information?
- Developers of apps in which users agree to have their location tracked and stored may need to be trusted to access that data to refine their algorithms and further improve the service. But does that use case necessitate access to real names or other PII data?
- Internal auditors need to be trusted to access internal records, such as HR files. But should they be trusted to access large numbers of staff records at once? What is the appropriate threshold?
These examples raise questions about whether traditional approaches to access control are enough to meet heightened demands around privacy. Most organizations use some form of a role-based access control regime – assigning access to resources according to broad, pre-defined roles. Most review access control on a regular basis to ensure business justification remains valid. But the examples above suggest this approach doesn’t account for all security and privacy use cases.
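To make the limitation concrete, here is a minimal sketch of role-based access control. The role names, resource names and permission sets are purely illustrative, not any particular product’s model; the point is that the decision depends only on a broad, pre-defined role, with no regard for context.

```python
# Role-based access control (RBAC) sketch: a static mapping from roles
# to the resources those roles may access. All names are illustrative.
ROLE_PERMISSIONS = {
    "loan_officer": {"credit_reports", "loan_applications"},
    "internal_auditor": {"hr_files", "expense_records"},
}

def rbac_can_access(role: str, resource: str) -> bool:
    """Grant access if the role's static permission set includes the resource."""
    return resource in ROLE_PERMISSIONS.get(role, set())

# Coarse-grained: a loan officer can read *every* credit report,
# whether or not they are handling that applicant's case.
print(rbac_can_access("loan_officer", "credit_reports"))  # True
print(rbac_can_access("internal_auditor", "credit_reports"))  # False
```

Nothing in this model can express “only the records of your own clients” or “only a reasonable number of staff files” – which is exactly the gap the examples above expose.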
An alternative and more-involved approach is to work towards policy-based (aka ‘attribute-based’) access control. This is best thought of as a more complex set of conditions (‘IF, THEN’ statements) that determine whether a subject can access a resource at a given time.
The idea is to implement fine-grained rules that can be dynamically fine-tuned according to security and privacy requirements.
So, in the healthcare example above, a health practitioner can only access the record of a patient for the time the patient is checked in under their care, and only from a managed device that was given a clean bill of health by a recent scan and is connecting via a known, trusted pattern of behavior. A combination of these attributes may also determine whether some fields are not accessible, whether PII data is tokenized, or whether the file can be written to or exported.
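The ‘IF, THEN’ conditions described above can be sketched in code. This is a hedged illustration of attribute-based rules for the healthcare scenario, not a real policy engine: the attribute names and the specific obligations (tokenization, read-only access) are assumptions made for the example.

```python
# Attribute-based (policy-based) access control sketch: each rule is an
# 'IF ... THEN' condition over attributes of the subject, the resource
# and the environment. All attribute names here are illustrative.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    patient_record_id: str
    assigned_patients: frozenset   # patients currently checked in under this practitioner
    device_managed: bool           # device is enrolled in management
    device_scan_clean: bool        # a recent security scan came back clean
    connection_trusted: bool       # connection matches a known pattern of behavior

def abac_decide(req: AccessRequest) -> dict:
    """Evaluate fine-grained rules; return a decision plus any obligations."""
    # IF the patient is not checked in under this practitioner, THEN deny.
    if req.patient_record_id not in req.assigned_patients:
        return {"allow": False}
    # IF the device is unmanaged or its last scan was not clean, THEN deny.
    if not (req.device_managed and req.device_scan_clean):
        return {"allow": False}
    # IF the connection pattern is unusual, THEN allow read-only access
    # with PII fields tokenized and export blocked.
    if not req.connection_trusted:
        return {"allow": True, "tokenize_pii": True, "read_only": True, "export": False}
    # All conditions met: grant access, still blocking bulk export by default.
    return {"allow": True, "tokenize_pii": False, "read_only": False, "export": False}
```

Because the decision is computed per request from current attributes, the same rules can be dynamically fine-tuned – tightening the device or connection conditions changes behavior everywhere, with no role redesign.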
Irrespective of the access model deployed, a more complex privacy landscape demands that security practitioners devise more fine-grained, adaptive means of determining what access is appropriate, and to codify these rules using the tools available.
Technical measures aside, our threat models must unfortunately assume that a person with sanctioned access to a resource might choose to abuse that access. This is as much a question of organizational culture as it is of controls.
On this Data Privacy Day, I will be asking my team what more we can do in our daily interactions to instill a duty of care for customer data across the organization.