DETECTION
Avoiding or reducing the impact of human-made or
human-linked catastrophes (such as epidemics), like the Great Fire of London of 1666 or
the last cholera epidemics in the same city, and protecting populations against
known and monitored risks, depend on two data streams:
- monitoring and detecting events in real time
- warning in real time.
The value attached to these is the protection of lives and
assets.
Insurance companies have their own methodology for evaluating risks
and the acceptable cost of preventing or reducing them. They sell insurance
products to individuals and organisations.
Governments have their own policies for managing, eliminating or
mitigating risks. Politicians are judged on their ability to manage
risk-avoidance schemes and, when a catastrophe happens, on how they manage the crisis.

Tuning the detection scheme pessimistically leads to
overprotection and high cost for no added benefit. Tuning it optimistically may overlook
risky situations and under-dimension the response scheme. Instead of a single mathematical model
with an assumed probability distribution, multiple scenarios have to
be considered, and risk must be bounded from below and from above,
leading to mathematical inequalities and multiple stochastic models.
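One way to picture this idea is a minimal simulation sketch: instead of trusting one assumed distribution, run several stochastic models of the same hazard and report the spread of their results as lower and upper risk bounds. The scenario names and parameters below are illustrative assumptions, not data from any real study.

```python
import random

random.seed(42)

def simulate_losses(mu, sigma, n=10_000):
    """Draw n loss samples from a simple lognormal loss model."""
    return [random.lognormvariate(mu, sigma) for _ in range(n)]

# Three stochastic models of the same hazard, each with its own assumptions
# (hypothetical parameters for illustration only).
scenarios = {
    "optimistic":  simulate_losses(mu=1.0, sigma=0.3),
    "baseline":    simulate_losses(mu=1.5, sigma=0.5),
    "pessimistic": simulate_losses(mu=2.0, sigma=0.8),
}

def expected_loss(samples):
    return sum(samples) / len(samples)

per_scenario = {name: expected_loss(s) for name, s in scenarios.items()}

# Bound the risk rather than commit to a single point estimate.
lower = min(per_scenario.values())
upper = max(per_scenario.values())

print(f"risk bounds: {lower:.2f} <= expected loss <= {upper:.2f}")
```

The decision-maker then works with the inequality `lower <= risk <= upper` rather than a single number, which is what makes over- or under-tuning the detection scheme visible.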
Railway companies use a standard model, jointly developed by
them at ISO, where risk is categorised by its potential impact, the highest
category being many lives at risk.
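A severity-based categorisation of this kind can be sketched as follows; the category names and thresholds here are illustrative assumptions, not the standard itself.

```python
from enum import Enum

class Severity(Enum):
    """Hypothetical severity ladder, from least to most severe impact."""
    INSIGNIFICANT = 1   # minor delays, no injuries
    MARGINAL = 2        # minor injuries possible
    CRITICAL = 3        # a single fatality or severe injuries
    CATASTROPHIC = 4    # many lives at risk (the highest category)

def categorise(potential_fatalities: int, potential_injuries: int) -> Severity:
    """Map a scenario's worst-case impact to a severity category."""
    if potential_fatalities >= 2:
        return Severity.CATASTROPHIC
    if potential_fatalities == 1:
        return Severity.CRITICAL
    if potential_injuries > 0:
        return Severity.MARGINAL
    return Severity.INSIGNIFICANT

print(categorise(0, 0).name)    # INSIGNIFICANT
print(categorise(1, 5).name)    # CRITICAL
print(categorise(10, 100).name) # CATASTROPHIC
```

In such a scheme, the 1895 Montparnasse derailment mentioned below, with its single fatality, would land well short of the highest category.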
They can build on a long history, which has led to safe
railway journeys with now very few accidents.
Here is a spectacular one from 1895, which claimed only one
life:
http://en.wikipedia.org/wiki/File:Train_wreck_at_Montparnasse_1895.jpg