At the Open Group meeting in Austin a couple of weeks ago, I attended the workshops on IT risk assessment. Pretty dull, eh? In fact, this topic produced some of the liveliest debate I’ve ever had at a conference.
Unless you specialize in this area, you may think that risk assessment is pretty well sewn up. You couldn’t be more wrong. Get 50 practitioners in a room and you will have 50 different methodologies for assessing IT risk. The trouble is that nearly all of them are subjective – the outcome of any risk assessment exercise is most likely to be ‘high’, ‘medium’ or ‘low’. Even when it’s an apparently objective number – 54,821, for example – you don’t learn all that much. Try going to your board and telling them that their IT risk is 54,821 and their eyes are likely to glaze over very quickly! Any attempt to calculate ‘annual loss expectancy’, although valiant, only results in trouble when the degree of variability is larger than the estimate itself!
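To see why that variability matters, here is a minimal sketch using the textbook formula ALE = SLE × ARO (single loss expectancy times annual rate of occurrence). All the figures are invented for illustration; the point is only that a wide uncertainty band on the rate swamps any single point estimate.

```python
# Hedged illustration of why one ALE number can mislead.
# All values below are invented, not real loss data.

sle = 250_000                    # single loss expectancy (assumed cost per incident)
aro_low, aro_high = 0.05, 2.0    # annual rate of occurrence: wide, honest uncertainty

ale_low = sle * aro_low          # optimistic estimate: 12,500
ale_high = sle * aro_high        # pessimistic estimate: 500,000

# The plausible range spans a factor of 40 -- reporting any single
# number in between conveys a precision the inputs don't have.
spread = ale_high / ale_low
```

Presenting the board with the range, rather than one number, at least makes the uncertainty visible.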
So we urgently need to deliver a methodology that everyone can use and that produces meaningful results – preferably one that can be justified in terms of overall business impact. And yes, there are people who are trying to put some rigor around the subject by using concepts such as Bayesian analysis or Monte Carlo simulations. The International Organization for Standardization (ISO) is also doing work in this area, as is the European Network and Information Security Agency (ENISA).
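To give a flavour of the Monte Carlo approach mentioned above, here is a small sketch that simulates many years of losses instead of producing one point estimate. The distributions and parameters are assumptions chosen purely for illustration (incident counts from repeated coin flips, loss sizes from a heavy-tailed lognormal) – real modelling would calibrate them from data.

```python
import random

def simulate_annual_losses(n_trials=100_000, seed=42):
    """Monte Carlo sketch of annual IT loss. All parameters are illustrative."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        # Incidents this simulated year: 20 independent 10% chances (mean ~2).
        incidents = sum(1 for _ in range(20) if rng.random() < 0.1)
        # Loss per incident: lognormal with a heavy right tail (assumed shape).
        totals.append(sum(rng.lognormvariate(9, 1.5) for _ in range(incidents)))
    totals.sort()
    mean_loss = sum(totals) / n_trials
    p95_loss = totals[int(0.95 * n_trials)]
    return mean_loss, p95_loss

mean_loss, p95_loss = simulate_annual_losses()
# With a heavy-tailed loss distribution, the 95th-percentile year is far
# worse than the average year -- which is exactly what a single
# 'expected loss' figure hides from the board.
```

The output of interest is the whole distribution (or a few percentiles of it), not the mean – that is what separates this style of analysis from a bare ALE figure.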
It’s clear that this is still a young and developing subject. IT risk is not like the life insurance industry, which has 300 years of data to draw on. This is cutting-edge work and we need people to contribute their ideas, their data, and their expertise. Symantec is playing a role, through its INFORM program, which is designed to relate IT risk and risk management to business objectives and drivers. We are also hoping that our next IT risk management report will put some objectivity around the subject and demonstrate the reality behind some of the myths that exist regarding IT risk management.