I am reposting some of my earlier writings. I've had some of them disappear from the face of the net, and I'd like to preserve some that I think have stood the test of time.
This is a post I made in 1991 to the Risks Digest mailing list. While my focus then was primarily environmental risks, I think it has a lot of relevance today, as we discuss health insurance, which is inherently about personal risk management, and the role of government and corporations, which is about managing an entirely different kind of risk.
In particular, I think that people on the relative right on this issue perceive the risk of government action vs. the risk of corporate action differently than people on the left do. Largely, this can't be reduced to "right" vs. "wrong", because it is based on individual value systems -- a "pricing model", if you will, for the risk of, say, undesired government action. These value systems are, in turn, informed by personal estimates both of the likelihood of misaction, and of the degree of intractability in addressing it.
[Sadly, Phil Agre has gone missing, with hundreds, if not thousands, of friends concerned about his well-being.]
Risk Assessment High Priesthood
Robert W. Kerns <[email protected]>
Tue, 3 Sep 91 22:22 EDT

In regard to Phil Agre's remarks on PR and the Risk Assessment field: I too am
offended by most of what I see in this area. (Note: What I see is a biased
sample, and not representative of all research done in the field.) I consider
the field to be seriously contaminated by self-serving interests, to the point
where I don't trust ANY conclusions presented, unless I'm also given the data,
methodology, and chain of reasoning.
The most fundamental flaw in my view, and one I haven't seen discussed except
peripherally, is the assumption that risk can be reduced to comparing numbers
of deaths, with all deaths being equal. Whenever the public at large doesn't
agree with this, "public reaction" is labeled as irrational. It is the
strong parallel between religious dogma and Risk Assessment
"professionals" that leads me to term this a High Priesthood. Time and time
again, I see results that are surprising from an "all-deaths-equal"
viewpoint used to argue that people's perceptions of risk are flawed.
I KNOW there's value to scientific methodology in analyzing risk, and I would
like to make use of it in forming my opinions. But when most of what I see
masquerading as analysis is contaminated with this religious assumption, I am
really thwarted in using Risk Analysis.
Let me take one oft-cited example, coal vs. nuclear, and twiddle with it a bit
to make what I'm talking about clear. (Numbers invented; I'm illustrating a
risk-assessment concept, not actually comparing coal and nuclear.)
Coal: 5 deaths per 10,000 man-years.
Nuclear: 1 death per 10,000 man-years.
Sounds like Nuclear is the clear winner here, right?
But let's consider a couple scenarios:
1) Coal: Equal geographic distribution of risk over area of benefit.
Nuclear: Risk concentration around the plant.
2) Coal: Constant, predictable rate of deaths.
Nuclear: Low rate, except in rare accidents, resulting in very
high health-care costs and entire families wiped out.
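Scenario 2 is easy to make concrete: two risk profiles can share the headline rates above yet have completely different shapes. A minimal sketch (all figures hypothetical, chosen only to match the 5-vs-1 rates above):

```python
# Hypothetical deaths per 10,000 man-years, year by year, over 100 years.
coal = [5] * 100              # constant, predictable rate
nuclear = [0] * 99 + [100]    # nothing most years, then one rare accident

def mean(xs):
    return sum(xs) / len(xs)

print(mean(coal))               # 5.0 -- the "5 deaths per 10,000" figure
print(mean(nuclear))            # 1.0 -- the "1 death per 10,000" figure
print(max(coal), max(nuclear))  # 5 vs. 100: worst single year differs 20x
```

The means reproduce the headline comparison, but the worst-year figures show what an "all-deaths-equal" summary throws away.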
The results I've seen indicating that people's perceptions of risk are "skewed"
indicate to me that people are more averse to risks with particular
characteristics:
* Unfair distribution
* High concentration; that is, an occasional disaster is worse than
a continual high risk.
* High subsidiary cost, such as an area of land rendered
uninhabitable, or high health-care costs.
* Low amount of individual control over individual risk factors.
* Risks whose assessments are based on questionable data, or come
from sources whose veracity is suspect.
To me, this seems to be a much more rational approach than reducing
the debate to numbers killed by coal or nuclear.
From this, you might assume that I am wildly pro-coal and
anti-nuclear, which would exaggerate my position considerably.
(I'm anti-coal, and consider chemical waste to be more serious
than nuclear waste. Nuclear waste decays, but heavy metals are forever!)
I see the same phenomenon operating in assessments of air-travel risk, where
people are more concerned with air accidents, which wipe out entire families,
and over which there is little personal control and choice, than over
automobile travel, which is far more dangerous. This concern is viewed as
non-rational, but I find it eminently rational. Killing off and injuring large
numbers of people at once overloads our mechanisms for dealing with tragedy, by
overloading emergency health-care, wiping out large segments of families,
wiping out entire upper-level management of companies, or rock groups, or
what-have-you.
There's more structure to risk than a single scalar value, such as deaths per
100,000, or deaths per {man-hours, passenger-miles, etc.}, and these scalar
numbers are not what society tries to optimize.
Real scientific research would try to go to the next step, and model and
quantify what people really DO try to optimize.
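One way that next step could look: score each risk on the characteristics listed earlier, not just on a death rate. Everything below is a hypothetical sketch -- the attribute ratings and weights are invented for illustration, not measured:

```python
# Hypothetical multi-attribute risk score: a weighted sum of a death rate
# plus 0-to-1 ratings of the risk characteristics listed earlier.
def risk_score(deaths_per_10k, concentration, unfairness,
               subsidiary_cost, lack_of_control, data_doubt,
               weights=(1.0, 4.0, 1.5, 1.0, 1.0, 0.5)):
    attrs = (deaths_per_10k, concentration, unfairness,
             subsidiary_cost, lack_of_control, data_doubt)
    return sum(w * a for w, a in zip(weights, attrs))

# Invented ratings for the coal/nuclear hypothetical above:
coal = risk_score(5, concentration=0.1, unfairness=0.2,
                  subsidiary_cost=0.3, lack_of_control=0.5, data_doubt=0.2)
nuclear = risk_score(1, concentration=0.9, unfairness=0.8,
                     subsidiary_cost=0.9, lack_of_control=0.9, data_doubt=0.7)
print(coal, nuclear)   # under these weights, nuclear scores higher
```

Under these invented weights, nuclear comes out as the larger risk despite the fivefold-lower death rate -- exactly the kind of preference the High Priesthood labels irrational. Real research would fit the weights to observed choices rather than assert them.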
But it's easy to look scientific if you ignore this and say, "See, it all
reduces to this number, which scientifically PROVES it."