How People And Businesses Get Blindsided By Threats
Originally published on the Black Duck Software blog on July 24, 2017
When Black Duck released its 2017 Open Source Security and Risk Analysis, the findings were deeply concerning. Among the audited applications, 96% utilized open source, of which 67% contained known vulnerabilities. On average, the identified vulnerabilities had been known for four years.
What allowed those vulnerable components to persist prior to the audits that discovered them? Are the involved developers, QA engineers, and managers guilty of contemptible degrees of negligence? Perhaps not. Failure to identify and respond to threats, it would appear, reaches far and wide, even to some of the biggest companies in the U.S.
Beyond Software
In its June 2017 “Strategic Readiness and Transformation Survey,” Innosight surveyed 346 U.S. managers, executives, and board members at firms grossing $2 billion and above (most grossed $10 billion and above). While the survey focused on strategy rather than operations, here too the authors found inadequate threat response.
“Perhaps the most striking finding is the disconnect between confidence levels and specific perceptions of threats and competition. 82% of respondents expressed degrees of confidence that their company is prepared to change in response to disruptive trends. Yet their actual activities fall short of what is required to justify their confidence. There are multiple warning signs in the data that suggest they have strategy and organizational blind spots that may undermine their ability to adapt.”
The report’s authors cited several key factors, including insufficient attention to new sources of competition, new growth products, market pace, digital investments, and AI. They call this disconnect between the warranted response and the actual response a “confidence bubble.”
Anatomy of a Confidence Bubble
What can make so many stakeholders across so many industries respond inadequately to threats? The term “confidence bubble” can imply emotional delusion: excess confidence bordering on arrogance. But given how widespread the phenomenon is, the explanation is more likely to be found in human nature than in the particular personalities of contributors, managers, and executives. Here are just some of the forces that may be creating blind spots in your organization:
- Misaligned Incentives. The incentives within an organization may disfavor identifying and reporting threats. For instance, if the organization rewards shipping product on time, individual contributors may feel pressured to hold back information that would force a missed delivery deadline. Similarly, if the organization’s leadership is especially vocal about cost control, a contributor may fail to escalate a threat that would require additional expenditure to mitigate. Addressing a threat early could be far less costly to the organization than addressing it after the damage is done, but in the latter case the blame is diluted across the entire organization, whereas a subordinate who escalates early is likely to shoulder a disproportionate share of it. No one wants to be the messenger who gets shot.
- Automation Bias. People have a documented tendency to trust machines and automated decision aids while ignoring or underutilizing outside information, even when that information is contradictory and correct. But automation and information systems can only consume the information they are programmed to consume, and they can render their users oblivious to threats outside that scope. For example, a company’s IT personnel may have a dashboard aggregating data from virus scanners, network traffic analysis, authentication system logs, and other sources. If that dashboard fails to measure another attack vector, such as application component vulnerabilities, its users will have a degree of confidence that does not correspond to their actual exposure (see the sketch after this list). Similarly, business intelligence systems and dashboards used to make and support management decisions are, by their very nature, backward‑looking: they provide analysis and inference based only on past data. Because these systems are reactive rather than proactive, instinctive overreliance on them can create blind spots when anticipating future threats.
- Cognitive Dissonance. People are biased toward avoiding information that contradicts their previously established beliefs. So even if threat information enters a stakeholder’s field of awareness despite the obstacles above, that stakeholder may be more likely to find an excuse to dismiss it than to change their beliefs.
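To make the dashboard example concrete, here is a minimal sketch in Python of the kind of coverage check that can surface such a gap. All of the source and vector names are hypothetical, not taken from any real product; the point is only that a blind spot is a required threat vector that none of your data sources reports on.

```python
# Hypothetical mapping of dashboard data sources to the threat vectors
# each one observes. Names are illustrative only.
DASHBOARD_SOURCES = {
    "virus_scanner": {"malware"},
    "network_traffic_analysis": {"intrusion", "data_exfiltration"},
    "auth_system_logs": {"credential_abuse"},
}

# Threat vectors the organization has decided it needs visibility into.
REQUIRED_VECTORS = {
    "malware",
    "intrusion",
    "data_exfiltration",
    "credential_abuse",
    "open_source_component_vulnerabilities",  # missing from the dashboard
    "phishing",
}


def uncovered_vectors(sources, required):
    """Return the required threat vectors that no data source reports on."""
    covered = set().union(*sources.values()) if sources else set()
    return sorted(required - covered)


if __name__ == "__main__":
    gaps = uncovered_vectors(DASHBOARD_SOURCES, REQUIRED_VECTORS)
    if gaps:
        print("Blind spots (no data source covers these vectors):")
        for vector in gaps:
            print(f"  - {vector}")
    else:
        print("Every required threat vector has at least one data source.")
```

With this illustrative data, the check flags open_source_component_vulnerabilities and phishing as blind spots; without such a check, the dashboard’s users would have no reason to doubt their coverage.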
Burst Your Confidence Bubble
Confidence bubbles need to be prevented and, when they do form, burst proactively. Here are a few questions leaders at every level should ask frequently:
- Have I communicated recently and frequently that I want to hear bad news as soon as possible? Is everyone on my team aware that I will thank, not blame, the messenger? Have I demonstrated to my team that I am eager to constructively address problems once brought to my attention?
- What are my current tools and metrics not telling me? What are the threat vectors that we are not currently considering and measuring? How can we measure them?
- What are my current beliefs about our threat readiness? What concrete steps will I take when presented with information that contradicts those beliefs?