This post is in two parts. Part I deals with the illusion of cybersecurity and its relationship to knowledge and vulnerability.
Part II deals with the question of WHY? – an exploration of the "cause and effect" relationship in cyber threats with respect to the "arrow of time" principle, and the relationship between threat, vulnerability, actor, and motivation.
Part I of this post is inspired by Daniel J. Boorstin's quote from the book The Discoverers: A History of Man's Search to Know His World and Himself – "The greatest enemy of knowledge is not ignorance; it is the illusion of knowledge."
This is a well-known phenomenon in cognitive behavioral science, popularly known as the "illusion of knowledge". One aspect of this illusion is that we easily mistake surface understanding for deep understanding, what experts call the "illusion of explanatory depth": it leads us to think we have a deep understanding when all we really have is knowledge of surface properties.
The real world is a big, scary place, and the last thing we want to be constantly reminded of is how little we understand. The illusion of knowledge is necessary to keep us from facing our incompetence, and arguably even for the survival of the human species. Daniel Kahneman, the renowned psychologist and winner of the Nobel Prize in Economics, explains the rationale in his bestselling book "Thinking, Fast and Slow". Other books on cognitive behavior that give us a deeper understanding of this phenomenon are "Predictably Irrational" by Dan Ariely and "The Hidden Brain" by Shankar Vedantam.
This phenomenon is called the Dunning–Kruger effect.
David Dunning and Justin Kruger (1999) of Cornell University, who developed this theory, concluded: "The miscalibration of the incompetent stems from an error about the self, whereas the miscalibration of the highly competent stems from an error about others."
Dunning and Kruger proposed that, for a given skill, people under the "illusion of knowledge" will:
1. Tend to overestimate their own level of skill.
2. Fail to recognize genuine skill in others.
3. Fail to recognize the extremity of their inadequacy.
4. Recognize and acknowledge their own previous lack of skill, if they are exposed to training for that skill.
Now this principle can be extended to organizations too, which are, after all, collections of human cognition.
I believe that this theory can be extended to cybersecurity, hence my statement: "The greatest enemy of cybersecurity is not ignorance of cybersecurity; it is the illusion of cybersecurity." Stuart Oskamp (1965) and many other researchers have documented the perils of too much information leading to what is known as the overconfidence effect, which in turn breeds illusion. This overconfidence-driven illusion typically leads to overestimation of an organization's performance in cybersecurity; such illusory superiority in turn feeds the illusion of control (Ellen Langer) and the planning fallacy (Daniel Kahneman and Amos Tversky, 1979).
From a risk perspective, this illusion brings forth two key concerns and consequences:
1. Risk is Certain, Denial is not an Option: The illusion of security, the failure to recognize true risks, and the inaction that follows will lead to more cyber attacks. Robert Mueller, former FBI Director, has said: "There are only two types of companies: Those that have been hacked, and those that will be. Even that is merging into one category: Those that have been hacked and will be again."
2. Social Amplification of Risk: The world is becoming more and more interconnected and flatter, as explained in "The World Is Flat: A Brief History of the Twenty-First Century" by Thomas L. Friedman. This age of empowerment, convergence, and innovation, powered by the "social oil", is bringing to light new risks, the primary one being the social amplification of risk. In a closed world, the illusion of cybersecurity posed very little risk; in the new world powered by SMACT, the illusion of cybersecurity amplifies the actual risk.
This illusion is caused by what is known in psychology as cognitive bias. It is good to have an understanding of the various cognitive biases that affect an organization's decision-making, value systems, and behavior.
How do we address this from a cyber security perspective?
The beauty of an illusion is that once you know it and acknowledge it, it is no longer an illusion. The true nature of illusion is beautifully explained in the Indian philosophical concept known as "Maya". In darkness (ignorance) you may mistake a rope for a snake and exhibit the natural fear response, but when the darkness is removed (by knowledge) and you realize that the snake was only a rope, all your fear and the accompanying reactions evaporate immediately.
Every vulnerability can be placed in a two-by-two matrix: known or unknown to you, and known or unknown to others. When I say "known/unknown to others", that could mean "known/unknown" to the good guys or "known/unknown" to the bad guys; the implications are different. The word vulnerability should be viewed in a larger context: not just technology vulnerabilities (as found in the CVE database), but also human vulnerabilities. I would define vulnerability as the inability to withstand the effects of a hostile environment, being susceptible to being wounded, hurt, or attacked. A vulnerability is a weakness in a system or human that can potentially be exploited by an attacker. The risk presented by that vulnerability is based on the likelihood that an attacker (actor) will take advantage of it and exploit it for a specific motivation. Even though, in reality, not all vulnerabilities are readily or reliably exploitable, the risk is present, and it could be only a matter of time before somebody comes up with an exploit if the motivation is compelling enough.
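The idea that risk combines the likelihood of exploitation with the harm an exploit would cause can be captured in a minimal sketch. This is an illustrative qualitative model only; the function name, levels, and scoring scale are my assumptions, not something defined in the post or in any particular standard.

```python
# A minimal qualitative risk-scoring sketch, assuming a simple
# likelihood x impact model. All names and scales are illustrative.

LEVELS = {"low": 1, "medium": 2, "high": 3}

def risk_score(likelihood: str, impact: str) -> int:
    """Score the risk a vulnerability presents: how likely an actor
    is to exploit it, times how badly an exploit would hurt."""
    return LEVELS[likelihood] * LEVELS[impact]

# A flaw that is easy to exploit on a critical system scores highest,
# even before anyone has published a working exploit.
print(risk_score("high", "high"))   # 9
print(risk_score("low", "medium"))  # 2
```

The point of even a toy model like this is that a vulnerability with no known exploit still carries a nonzero score: likelihood is never zero when the motivation is compelling.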
Known Known: If the vulnerability is known to you and known to others (e.g. a published vulnerability with a CVE number), and you have not acted upon it, you are either already compromised or just waiting to be compromised. The illusion that nothing will happen to you, and the inaction it leads to, fall under the classic Dunning–Kruger effect. You are a perfect textbook candidate for an opportunistic or a targeted attack.
Known Unknown: If the vulnerability is known to you but unknown to others, especially the bad guys, you are gambling that the bad guys will not find it and launch an attack to harm you. You may fly under the radar of opportunistic attacks for some time, but if you are the target of a targeted APT (advanced persistent threat) attack, you may be out of luck. If your logic is "I will never be a target for an APT", then you are in denial and under the illusion of the Dunning–Kruger effect. Adopting a secure-by-design approach, both at design time and at run time, is a good way of mitigating the risk; timely action to remove the cause of the vulnerability is the recommended approach.
Unknown Known: If the vulnerability is unknown to you but known to the attackers, you are in a bad situation. If it is known to others, it should be possible for you to gain insight into it too; a comprehensive vulnerability assessment approach can reduce the risk. Watching for published and known vulnerabilities and moving into the known/known quadrant is a good strategy. If the vulnerability is unknown to you but known to the good guys (the white hats), good information sharing and collaboration (threat intelligence sharing) is a sound mechanism for addressing the risk before the attackers eventually learn of it. Rest assured, the bad guys will eventually learn of it and try to exploit it.
Unknown Unknown: Here the vulnerability is known neither to you nor to others at this point in time. This falls under Donald Rumsfeld's famous "unknown unknowns" formulation. This category is where threat R&D and threat intelligence come into play. Partnering with a good security research firm and a trusted cybersecurity partner can protect you from zero-day attacks. If for some reason the bad guys discover such a vulnerability first and mount an attack, it is imperative that the new attack information be shared among nations, enterprises, and individuals to prevent further copycat attacks. An organization that adapts, mutates, and evolves constantly, scanning the environment for new information through a discovery process and willing to change, will survive and flourish.
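The four quadrants above amount to a small decision table: two booleans (known to you, known to others) map to a recommended response. The sketch below summarizes the post's advice; the function name and the action strings are my own illustrative condensation, not an official taxonomy.

```python
# A sketch of the known/unknown vulnerability quadrants, mapping
# (known to you, known to others) to the response suggested above.
# Names and action summaries are illustrative.

def quadrant(known_to_you: bool, known_to_others: bool) -> str:
    """Classify a vulnerability and summarize the recommended action."""
    if known_to_you and known_to_others:
        return "Known Known: act now, or you are a textbook target"
    if known_to_you:
        return "Known Unknown: fix via secure-by-design before attackers find it"
    if known_to_others:
        return "Unknown Known: run assessments, share threat intelligence"
    return "Unknown Unknown: invest in threat R&D and trusted partners"

# e.g. a published CVE you have ignored:
print(quadrant(known_to_you=True, known_to_others=True))
```

Framing it this way makes the post's core point explicit: the riskiest cells are the ones where your own knowledge lags, and the only lever you fully control is moving vulnerabilities toward the known/known quadrant.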
While this may look simple when viewed through the lens of vulnerability alone, in reality it is not. In cybersecurity, failing to recognize uncertainty (the risk of the known and not acting upon it, and the risk of not discovering the unknown) has huge consequences. More than knowledge of the vulnerability itself, the key is the organization's capability to discover vulnerabilities, especially internal weaknesses it has control over, and act on them in a timely manner.
To better understand the above, we need to know the relationship between threat and vulnerability, leading to risk and exploit, through the lens of the theory of causation, i.e. cause and effect.
Part II of this post “Take away the Cause and the Effect Ceases” will explain the relationship between cyber threat and vulnerability, the need to understand the relationship and its consequences in the “Arrow of Time” continuum.
Capgemini has introduced a consolidated security service called Cybersecurity Global Service Line, integrating its expertise in cybersecurity.
We provide end-to-end advisory, protection, and monitoring services to secure your organization. To find out more, visit www.capgemini.com/cybersecurity and the SMACT blog series "Putting cyber security at the heart of digital transformation".