The root of
the problem is that people try too hard to reconcile specialist meanings with
ordinary language. For literature and
poetry, the multiple meanings of English words and the ambiguities of syntax
are often a useful and sometimes a wonderful thing. But for practical affairs of science,
engineering, business and law, it’s a breeding ground for problems.
Other
professions have solved the problem in various ways. Mathematics precisely defines ordinary words like
“group” and “function” to have special meanings. The relatively closed nature of the
profession keeps misunderstandings with outsiders rare. This method would not work well in cyber
risk, as the word “risk” itself shows, because the specialists have to
communicate with non-specialists all the time.
We can’t appropriate ordinary words to mean something special only when
we talk amongst ourselves.
A variant of
the ordinary-word method is to put common words to new uses, as
“motor” and “inductor” were in the nineteenth century, and then rely on the
obviously new context to prevent misinterpretation.
Another way
out is to create new words that ordinary people won’t use. Biology and medicine are famous for
this. If you mean a specific kind of
mosquito or muscle, it’s anopheles or
biceps brachii. When you want to make sure that outsiders are
kept outside and suitably intimidated, a dead language is perfect! But that’s the trouble: arcane words are a barrier to communication,
and that’s the last thing we need in cyber risk.
We can
create new words out of whole cloth, instead of stealing from Aristotle and
Virgil. “Cybernetics” and “cryogenics”
are coinages that do not impede communication with lay persons. Technology is a rich source of neologisms, as
witness “transistor,” “diode,” and “automobile.”
The last way
out of the swamp of confusion, one that I find very attractive, is the noun
phrase. Here you put together a few ordinary
words in an improbable juxtaposition, such as “integrated circuit,” “tensile
strength,” or “coefficient of thermal expansion.” This seems to be the best solution. The reader needn’t have studied Latin or
Greek, she can easily see that something special is meant, and even the
non-specialist can get a sense of what the special meaning is.
To get this
movement kicked off for cyber risk, I’ll propose some of my own
definitions. I build on the excellent
foundation of the FAIR taxonomy (Factor Analysis of Information Risk), which
you can find on The Open Group website.
First let’s
agree to use “risk” by itself only as a lay term, and otherwise regard it as a
four-letter word not to be used in polite conversation. And when we use it in a lay context, let “a
risk” mean “a loss event scenario,” as advised by Freund & Jones (“Measuring
and Managing Information Risk,” p. 353).
Notice the “a”.
Here are a
few related terms and my proposed definitions.
Risk Function. The probability distribution of loss magnitudes
for some stated period of time, such as one year. This is what I think most people really mean
when they speak of the “risk” of something.
Loss Exceedance. The probability that a
particular loss magnitude will be exceeded within the given time frame, expressed as a
function of the loss magnitude. It is
the “tail distribution” (complementary cumulative distribution) of the risk function.
This is a standard term in the insurance industry (from which we can
learn much). The loss exceedance
function has some nice properties that give it intuitive appeal.
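These two ideas are easy to make concrete with a Monte Carlo sketch. The snippet below assumes a lognormal annual-loss model with invented parameters (a real risk function would be derived from FAIR-style frequency and magnitude estimates, not pulled from thin air); it exists only to show that the loss exceedance function is the empirical tail of the simulated risk function.

```python
import random

random.seed(1)

# Hypothetical risk function: annual loss in dollars, drawn from a
# lognormal distribution. The parameters (11.0, 1.2) are illustrative only.
N = 100_000
annual_losses = [random.lognormvariate(11.0, 1.2) for _ in range(N)]

def loss_exceedance(losses, x):
    """Empirical probability that the annual loss exceeds magnitude x.
    This is the tail (complementary CDF) of the risk function."""
    return sum(1 for loss in losses if loss > x) / len(losses)

# Two of the "nice properties": the curve starts at (nearly) 1 and is
# non-increasing as the loss magnitude grows.
for x in (10_000, 100_000, 1_000_000):
    print(f"P(annual loss > ${x:,}) = {loss_exceedance(annual_losses, x):.3f}")
```

Reading the curve is intuitive even for a non-specialist: pick a loss magnitude on the horizontal axis and the curve gives the chance of losing at least that much in a year.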
Risk Decision. A decision by the leadership of an
organization to accept an option having a given risk function in preference to
another, or in preference to taking no action.
I assume that competent leadership of any organization worth its pay can
make such a decision, at the appropriate level of seniority.
Risk Appetite. The worst (least-preferred) set of probability
distributions of loss magnitudes that the management of an organization is
willing to voluntarily accept in the pursuit of its objectives. The key idea here is voluntariness.
And finally,
to settle the age-old dispute about the difference between risk appetite and
risk tolerance:
Risk Tolerance. The set of least-preferred probability
distributions of loss magnitudes that the management of an organization is
willing to accept when presented with them involuntarily. Risk Tolerance is by definition greater than
Risk Appetite (it includes more probability distributions of losses). The key proviso here is involuntariness.
I’ll have
more to offer later about notions like attack, attack surface, attack vector,
exploit, flaw, and vulnerability.