Phil at itinerant and indigent has an interesting personal security assessment form that some may find useful. It is intended for ex-pat staff, but I don't see why it couldn't be used, with some slight modifications, for national staff.
However, I recommend it with a caveat. It is important that the user of the form and the compiler of the data recognize that there is a difference (sometimes a very large difference) between feeling (un)secure and being (un)secure. People tend to underestimate threats with which they have become familiar and overestimate new threats, especially if those threats are vivid and easy to recall.
I especially like question 5:
Describe a few things that could happen over the next two months that would cause you to review your posting here. Try to be very specific as you describe a threshold that, once crossed, would make you radically reassess being here in Afghanistan.
Having people assess their risk cut-off level proactively might just help counter some of the cognitive biases that skew risk perception.
You can find the form here. Just scroll down to the bottom of the post.
Something about most UN and NGO security reports has always made me uneasy. Don't get me wrong. It's not that they aren't thorough. A lot of work goes into fact checking and ensuring that what they say is 'correct'. It's just that the typical security report is a comprehensive list of recent past incidents combined, if we are lucky, with their assessed causes. Incident statistics are then charted and 'trends' are identified. This always made me a little nervous.
To be fair, I never really knew why it made me nervous until I read "The Black Swan". Nassim Nicholas Taleb raises several points that help explain my unease.
The first is that more information is not necessarily better. It's very easy to get bogged down in detail that has no real relevance to the issue at hand.
The second factor is what Taleb calls the Ludic Fallacy. In brief, this is the assumption that the unexpected can be predicted by extrapolating from statistics based on past observations. Taleb argues that while this holds true for theoretical models based on games of chance, it seldom holds true in the real world for the following reasons:
* We don't know what we don't know. (See the Unknown Unknown.)
* Very small (perhaps imperceptible) changes in the variables can have a huge impact on the outcome. This is commonly referred to as the Butterfly Effect.
* Theories based on experience are fundamentally flawed, as events that have not occurred before (or are outside living memory) cannot be accounted for.
The Washington Post graphic below, which shows the frequency and lethality of suicide attacks since 1981, illustrates the problem. If we had examined the chart in 2000, would it have led us to predict 9/11 (a classic Black Swan)? If we had re-examined it in 2003, would it have led us to predict the sudden increase in the frequency of attacks in 2007? What does 2007 tell us about 2008? Looking at the trend from 1981 to 1989, how many researchers would have concluded that suicide attacks were in decline and opined that such attacks were ineffective in accomplishing the attackers' goals?
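The trap is easy to reproduce. Here is a toy sketch, with invented figures rather than the Washington Post's data, of what happens when a straight-line trend fitted to a quiet decade meets a Black Swan year:

```python
# Toy illustration (hypothetical figures, not the actual attack data):
# fit a least-squares trend line to a gently declining decade of annual
# incident counts, then see how badly it anticipates a regime change.

def linear_trend(years, counts):
    """Ordinary least-squares slope and intercept."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(counts) / n
    sxx = sum((x - mean_x) ** 2 for x in years)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, counts))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

# A quiet, gently declining decade of (made-up) incident counts...
years = list(range(1990, 2000))
counts = [12, 11, 11, 10, 9, 9, 8, 8, 7, 7]

slope, intercept = linear_trend(years, counts)
forecast_2007 = slope * 2007 + intercept  # what the "trend" predicts
actual_2007 = 55                          # a hypothetical Black Swan year

print(f"trend forecast for 2007: {forecast_2007:.1f}")
print(f"hypothetical actual:     {actual_2007}")
```

The declining trend line forecasts a near-zero figure, while the hypothetical surge is an order of magnitude larger: the model is not wrong about the past, it is simply silent about the discontinuity.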
For a brief period, while I was an analyst, I worked for a General who was inclined to say, "Tell me what you know, tell me what you think you know, and tell me what you don't know." Of course he missed a category of information. It was what we later came to call the "unknown unknown". Taleb refers to this category of information as silent evidence. It is the vast body of information that we are not aware of and, even worse, are not aware that we are not aware of.
Does this matter to the NGO security analyst? Of course! If we fail to acknowledge the existence of silent evidence, we fool ourselves into believing we know the world better than we really do. We track incidents and develop models to try to predict the future without considering how incomplete our models are. Worse, if we are naïve enough to believe our models, we unknowingly leave ourselves exposed to future unknown risks.
Lesson Learned: I don't know as much as I think I do. No matter how much information I have, the vast bulk of it, the hidden silent evidence, remains below the surface. From this morass of unseen circumstance can spring forth all manner of unanticipated surprises.
Over the past few weeks I've been reading Nassim Nicholas Taleb's "The Black Swan: The Impact of the Highly Improbable". It has been a very difficult read for me. Not so much because his ideas are complicated (they are, but Taleb explains them very well). No, my difficulty has been that the book challenges, even destroys, ideas that I have long held dear.
I've learned (maybe I should say I'm trying to learn) a lot from The Black Swan. Taleb's ideas are changing my view of the nature of knowledge, analysis, and prediction. Over the next few posts I hope to outline some of the lessons that I think NGO security officers can take from this book. It won't be easy and I'm sure that I'll get a lot wrong.
For this post, however, I'll take the easy way out. This video clip is of Taleb himself, explaining the term "Black Swan".
Is the world a more dangerous place to live now than it was ten years ago? How about a hundred years?
According to this first video the answer is yes. In it the University of Hawaii examines the complex issues of armed conflicts, peace-keeping operations and humanitarian relief with the input of former UN and government officials, humanitarian aid workers and PKO experts.
In "A Brief History of Violence" Steven Pinker argues the opposite. His data suggests that we are living in what might very well be the most peaceful time in human history.
So who is right here? Is the world safer? How do our cognitive biases shape our perceptions of the risks we face now versus those faced by our ancestors?
I love maps. Good maps can be a security analyst's best friend. A good map can summarize an entire analytical report.
A recent post on Sources and Methods led me to Aon Corporation's Terrorism Threat Map. Risk levels, regions of special risk, religious extremist groups, political extremism, separatist movements, and kidnap risk are all covered in a simple and easy to grasp format. The legends are chock-full of information as well. One even contains a concise explanation of the terrorism risk assessment process.
Aon's 2008 Political and Economic Risk Map is another that deserves a place on your office wall. Not only does it illustrate the usual war, terrorism, and civil disturbance risks but it also highlights exposure to the current global credit crisis. You can get a copy here but unfortunately you'll have to fill in one of those annoying online forms.
Privacy International's map of Surveillance Societies Around the World isn't nearly as professional as the ones above, but it is still effective at pointing out that the world's nosiest governments aren't necessarily where you might think. Although I think Privacy International tends to be somewhat alarmist, my biggest problem with their latest report is that they still leave large portions of the world uncovered. Surely Africa, the Middle East, and South Asia deserve greater attention?
For extra analytical fun try overlaying the maps. How does surveillance intensity compare to terrorism risk? Kidnap risk?
10 Ways We Get the Odds Wrong: Psychology Today takes a look at why our brains are so bad at assessing modern risks. There is an interesting, if strictly US-centric, quiz at the end that will let you test your risk knowledge.
Finding a Job: The AidWorkers Network has a good guide for anyone looking to break into the aid worker job market. It's not limited to security jobs, but it doesn't exclude them either.
Travel Safely: Gadling shows you how to create your own DIY personal first aid kit for the road. Note that this kit is for travel-related "nuisance illnesses". For field work I carry a larger first aid kit as well.
"Sources and Methods" has a great post on rank ordering the risks from the 2008 Global Risk Report. I'm not really conversant with all the math, but his rank-ordered list is pretty interesting. Where do you suppose international terrorism ranks compared to a pandemic? Or how about "failed and failing states" compared to "natural catastrophe: earthquake"?
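The general idea behind rank ordering is simple even if the report's math isn't. A minimal sketch, assuming a basic expected-impact score (likelihood times severity, with figures I have invented purely for illustration, not the Global Risk Report's actual numbers or methodology):

```python
# Hypothetical likelihood (probability over some horizon) and severity
# (arbitrary 1-10 scale) for a few of the report's risk categories.
# These numbers are made up for illustration only.
risks = {
    "international terrorism":         (0.15, 6),
    "pandemic":                        (0.05, 9),
    "failed and failing states":       (0.30, 5),
    "natural catastrophe: earthquake": (0.10, 7),
}

# Score each risk by expected impact and sort from highest to lowest.
ranked = sorted(
    risks.items(),
    key=lambda item: item[1][0] * item[1][1],  # likelihood * severity
    reverse=True,
)

for name, (likelihood, severity) in ranked:
    print(f"{name:35s} score={likelihood * severity:.2f}")
```

Notice how the ordering can surprise: a vivid, low-likelihood risk can rank below a duller but far more probable one, which is exactly the kind of result that cuts against our intuitions.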
Risk homeostasis theory, developed by Gerald J.S. Wilde, has some potentially serious implications for NGO security.
At its core, risk homeostasis theory has two basic premises. The first is that every individual has an inbuilt, personal, acceptable risk level that does not readily change. The second is that when the level of acceptable risk in one aspect of an individual's life changes, there will be an inverse change in acceptable risk elsewhere. In other words, everyone has their own risk 'set point' at which they are comfortable and at which they will endeavour to remain.
In an NGO context this suggests that increased security precautions encourage greater risk taking amongst staff in other areas of their lives. Better vehicles and improved communications would therefore result in staff pushing the envelope in their field activities. In effect, according to risk homeostasis theory, security measures merely serve to "move risk-taking behaviour around".
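The two premises can be captured in a toy model. This is my own simplification of the idea, not Wilde's formalism: total accepted risk is pinned to a personal set point, so cutting risk in one area is compensated for elsewhere.

```python
# A minimal sketch of the risk homeostasis idea (a toy model of my own,
# not Wilde's actual formulation): each person holds total perceived
# risk at a fixed personal "set point", so reducing risk in one area
# is compensated for by taking more risk in another.

class StaffMember:
    def __init__(self, set_point):
        self.set_point = set_point     # personal acceptable total risk
        self.equipment_risk = 0.6      # e.g. vehicle/comms exposure
        # behavioural risk expands to fill the gap up to the set point
        self.behavioural_risk = set_point - self.equipment_risk

    def improve_security(self, reduction):
        """Better vehicles and comms cut equipment risk..."""
        self.equipment_risk -= reduction
        # ...but the theory predicts behaviour compensates.
        self.behavioural_risk = self.set_point - self.equipment_risk

    @property
    def total_risk(self):
        return self.equipment_risk + self.behavioural_risk

field_officer = StaffMember(set_point=1.0)
before = field_officer.total_risk
field_officer.improve_security(reduction=0.3)  # better vehicles, sat phones
after = field_officer.total_risk

print(before, after)                   # total risk is unchanged
print(field_officer.behavioural_risk)  # field-activity risk rose instead
```

In the model the security investment changes where the risk sits, not how much of it there is, which is exactly Wilde's "move risk-taking behaviour around" claim.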
Wilde’s book, Target Risk, is full of citations from studies showing that vehicle safety improvements increase risky driving and fail to decrease the accident rate. He also cites examples of industrial safety programs that don’t decrease overall work related injuries and anti-smoking campaigns that come to nothing.
All of this raises the question of whether current security programs are, or even can be, effective. Do security officers, security training programs, and improvements in equipment merely shift the risk? Do aid workers compensate for decreased risk by pushing harder and farther than they would otherwise? Should we be concentrating on mitigation rather than risk reduction?