Cybervitals: Why Inclusion of Non-Technical Contributors Matters in Healthcare Cybersecurity

Contributor: Vidya Murthy, WEMBA’42 
To learn more about Vidya, click here.


The absence of diversity in security roles makes progress harder, from addressing threats to innovating with partners. How can the cybersecurity community avoid biases, the unconscious judgments we make without being fully aware of them, and develop teams that understand the full security landscape?

Cybersecurity encompasses a variety of functions, from pen testing to incident response to training and awareness. But one of the common issues the security community faces is how it is perceived by outsiders. Media depicts security professionals as hoodie-wearing figures hunched over a screen, writing code. To someone with no previous exposure and no technical background, cybersecurity does not seem very appealing.

But according to a study by Frost & Sullivan, 30 percent of all cybersecurity roles are filled by people with non-technical backgrounds. And according to (ISC)², there is a global cybersecurity workforce gap of more than 3 million people, meaning the workforce needs to grow by 145 percent to close it. That's a lot of non-technical jobs.

It’s a common trope in cybersecurity, and in healthcare, to say that people are the weakest link. This is often backed by statistics such as 23 percent of all data breaches being attributable to human error or negligence. But maybe we should read that statistic differently: in 23 percent of cases, a human’s behavior was misunderstood and technology failed, leaving the human as the expected last line of defense.

Every assumption developers make about a system’s design or user behavior leaves a crack for attackers to exploit. If everyone on the security team shares similar experiences and working methodologies, attack vectors will be missed. There is more to understand about healthcare security than the technology behind it. How might a user’s training, culture, and environment affect security? What is the patient safety concern? Biases and assumptions about how individuals and organizations deploy and use technology can increase risk.

In the early days of computing and connected devices, there was a lot we didn't yet know about designing secure products and environments. Today, there are established, well-known frameworks and lots of best practices and advice to help people protect data, patients, and devices.

Here is a simple example of how human factors can be missed when a team does not include a cross-section of skill sets and capabilities.

In 2014, Apple launched its “comprehensive” health tracking app. It allowed people to track nearly everything, from daily movement and exercise to copper intake. Yet despite including such niche tracking abilities, Apple failed to include something that would arguably have been far more useful to about half of its customers: period tracking. Would that have happened if a more diverse group had been involved in the design?

In 2017, Deloitte conducted a study that delivered several interesting insights. My favorite part was the title, ‘The Changing Faces of Cybersecurity,’ which was not about diversity and inclusion; it was about the change in the skill set required to succeed in this space. Among the trends it identified: job descriptions moving away from narrow technical disciplines and becoming more ‘esoteric.’ The report also emphasized that cybersecurity’s future requires expertise in privacy and security regulation.

This ties in perfectly with the growing trend of implementing risk-based cybersecurity strategies. In essence, risk-based means aligning technological and programmatic decisions with risk. That is helpful whether an organization is building a program from scratch or identifying priorities for budgeting. One organization that reorganized its priorities around risk increased its projected risk reduction to 7.5 times that of its original program, at no added cost.
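To make the idea concrete, here is a minimal sketch of what “aligning decisions with risk” can look like in practice: score candidate controls by expected risk reduction per dollar and fund them in that order within a budget. Every control name, likelihood, impact, cost, and budget figure below is a hypothetical illustration, not data from any study mentioned in this article.

```python
# Illustrative sketch only: ranking candidate security controls by expected
# risk reduction per unit cost, then funding them within an assumed budget.
from dataclasses import dataclass

@dataclass
class Control:
    name: str
    likelihood_reduction: float  # assumed reduction in annual probability of a loss event
    impact_avoided: float        # assumed loss avoided per event, in dollars
    cost: float                  # assumed annual cost to implement, in dollars

    @property
    def risk_reduction(self) -> float:
        # Expected annual loss avoided = probability reduction x impact
        return self.likelihood_reduction * self.impact_avoided

# Hypothetical candidate controls (numbers are made up for illustration)
candidates = [
    Control("Phishing awareness training", 0.10, 2_000_000, 50_000),
    Control("Legacy device network segmentation", 0.05, 8_000_000, 120_000),
    Control("Endpoint detection on clinical workstations", 0.08, 3_000_000, 200_000),
]

budget = 250_000  # hypothetical annual budget
spent = 0.0

# Prioritize by risk reduction per dollar, funding controls until the budget runs out.
for c in sorted(candidates, key=lambda c: c.risk_reduction / c.cost, reverse=True):
    if spent + c.cost <= budget:
        spent += c.cost
        print(f"Fund: {c.name} (expected annual loss avoided ${c.risk_reduction:,.0f})")
```

The technical scoring here is simple on purpose; the harder work, and the part that does not require a technical background, is eliciting realistic likelihoods, impacts, and priorities from the people who run the affected processes.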

Anyone who took a strategy class at Wharton remembers the emphasis on determining the value you offer. This is precisely where cybersecurity and non-technical resources intersect. If non-technical contributors can identify the impact of cybersecurity on the business, that translates directly into value for the organization.

This can begin with the cybersecurity team asking business units which processes they regard as most valuable and which risks worry them most. That doesn’t require any technical skills, just a willingness to learn. Making this connection between the cybersecurity team and the business is a highly valuable step in and of itself: it motivates the business to care more deeply about security by making clear the bottom-line impact of a recommended control.

In practical terms, healthcare’s cybersecurity strategy is dysfunctional at best. Just look at the headlines in any given week for a story of a hospital held hostage by ransomware. What got us here is not going to be sufficient going forward, and we must learn to bring multiple perspectives into our collective strategy.


Contact Vidya at: [email protected]