Tuesday, 21 June 2016

Multi-hazards and dominoes at the UR2016 conference

This year the Understanding Risk 2016 Conference in Venice hosted two sessions related to multi-hazards and their interactions:

The domino effect: the future of quantifying compounding events in deltas


The “Domino effect” session combined short presentations from those researching interactions of hazards (mostly in deltas) with a panel discussion open to the audience. The discussion on data issues highlighted several problems with data on interacting hazards: the lack of detailed data at an appropriate spatial scale to determine hazard interactions, and the fact that loss data from cascading events tend to be attributed to the primary hazard alone. The insurance industry was highlighted as a potential source of useful data; it is widely acknowledged that insurers model what they commonly refer to as “secondary perils”. However, their methodology is not transparent, so it is unclear precisely how they model these interactions, what data they have access to, and whether they would be willing to share it. There is also a push for more and better data throughout the disaster community at the moment. This provides an excellent opportunity for the multi-hazards community to identify what data are needed to model and understand the interactions between hazards, and therefore what should be gathered now to develop our understanding of multi-hazards in the future.
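To make the loss-attribution problem concrete, here is a minimal sketch (my own, not something proposed in the session) of how a loss record could carry the full triggering chain rather than only the primary hazard. All field names and figures are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class LossRecord:
        """One recorded loss, tagged with the full hazard chain (hypothetical schema)."""
        event_id: str
        hazard_chain: list   # triggering hazard first, e.g. ["earthquake", "landslide"]
        loss_usd: float

        @property
        def primary_hazard(self) -> str:
            return self.hazard_chain[0]

        @property
        def proximate_hazard(self) -> str:
            # the hazard that directly caused this particular loss
            return self.hazard_chain[-1]

    # If the second record were attributed only to its primary hazard ("earthquake"),
    # the landslide contribution would be invisible in the loss statistics.
    records = [
        LossRecord("EQ2016-001", ["earthquake"], 2.0e6),
        LossRecord("EQ2016-001", ["earthquake", "landslide"], 1.5e6),
    ]

    cascading_loss = sum(r.loss_usd for r in records if len(r.hazard_chain) > 1)
    print(f"Loss attributable to cascading (triggered) hazards: ${cascading_loss:,.0f}")

Even a coarse tag like this would let analysts separate cascading losses from single-hazard losses when aggregating across events.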

Ways around the lack of data were also discussed, particularly the practical aspects of understanding multi-hazards for those on the ground. It was questioned whether we really need all the data required to model interactions before we can make decisions and act on them in the present. Civil protection agencies, non-governmental organisations, and communities, for example, have to make decisions about multi-hazards on a regular basis. These decisions cannot (and do not) wait for the science to catch up with needs. So what can we learn from those already dealing with multi-hazards? What do they need to make decisions? And are there gaps that science can fill without having to collect vast amounts of data?

One point of agreement from the discussion was that community knowledge is not enough: potential interactions between hazards are often unknown because they go beyond historical or community memory. But equally, science alone is currently not enough. A multi-disciplinary approach is needed to push the field forward in a practical and useful way. The session showed promise for future collaborative work on the theme, with an inclusive approach that links those with long experience in the field to those bringing a fresh perspective. In this way a range of skills and knowledge can be applied systematically to produce useful outputs on multi-hazards.


Challenges in developing multi-hazard risk models from local to global scale


The “Multi-hazard risk models” session was opened by Mauro Dolce from the National Civil Protection Department, Italy. He echoed some of the discussion from the “Domino effect” session, highlighting that multi-hazard thinking is already intuitive even where quantitative data are lacking; for example, displaced people are located away from flood plains in the aftermath of an earthquake. Echoing the literature, he outlined the difficulties of comparing natural hazards, but was optimistic about using losses as a common measure of comparison between them. He described cascading effects as an example of complexity, representing a significant challenge to implementing fully multi-hazard risk models. Despite the difficulties, he emphasised that the growing complexity of modern society makes these effects increasingly fatal and catastrophic when they do occur. In conclusion, he promoted the need to:
  • Close the gap between science and technology, and decision making
  • Make information open, available, and accessible
  • Develop standards for multi-hazard risk analysis
  • Develop partnerships between natural hazard silos
  • Develop a consistent capability for modelling multi-hazards globally, including independent, concurrent, and triggered events.
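As an aside, the idea of losses as a common currency across otherwise incomparable hazards is easy to illustrate. The sketch below is my own, with made-up numbers; triggered events are kept as their own category so the cascading contribution stays visible.

    # Illustrative only: expected annual loss (EAL) in GBP millions, made-up numbers.
    # Losses act as the common measure across otherwise incomparable hazards.
    eal_by_hazard = {
        "earthquake (shaking)": 120.0,
        "flood": 85.0,
        "windstorm": 40.0,
        "landslide (earthquake-triggered)": 15.0,   # triggered event kept separate
    }

    total_eal = sum(eal_by_hazard.values())
    for hazard, eal in sorted(eal_by_hazard.items(), key=lambda kv: -kv[1]):
        print(f"{hazard:35s} {eal:7.1f}  ({eal / total_eal:5.1%} of total)")
    print(f"{'TOTAL':35s} {total_eal:7.1f}")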
Peter Salamon from the Joint Research Centre emphasised the need to bring a community together to focus on multi-hazards and their interactions and to tackle the problem in a systematic and holistic way, suggesting a multi-hazard focus group specifically designed for this purpose.

The discussion in this panel session focused on the possibility of creating a global multi-hazard model with one standardised approach, particularly looking at underlying risk, loss, and impact of hazards as a basis for synthesising and comparing them. The suggested approach seemed to involve fusing the global hazard models and organisations that already exist (e.g. the Global Earthquake Model, Global Volcano Model, and Global Tsunami Model). It was suggested that the Global Earthquake Model's exposure database could be expanded to cover vulnerability to other hazards. Whilst this would be a promising advance for the multi-hazard field, and an achievable goal in the foreseeable future, there was little discussion of how interactions between hazards would be included in such an approach.
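To give a flavour of what expanding an exposure database to cover other hazards might look like, here is a toy sketch of my own (nothing to do with GEM's actual data model) in which each asset carries a separate vulnerability function per hazard. Notice that the hard part, how the hazards interact, is exactly what the naive cap at the end glosses over.

    # Toy multi-hazard exposure entry: one asset with a separate vulnerability
    # function per hazard, mapping hazard intensity to a damage ratio (0-1).
    # Entirely hypothetical - this is not GEM's actual schema.

    def vuln_shaking(pga_g: float) -> float:
        return min(1.0, max(0.0, (pga_g - 0.1) / 0.9))

    def vuln_flood(depth_m: float) -> float:
        return min(1.0, max(0.0, depth_m / 3.0))

    asset = {
        "id": "building-001",
        "replacement_cost": 250_000,
        "vulnerability": {"shaking": vuln_shaking, "flood": vuln_flood},
    }

    # Hazard footprint values at the asset's location for one scenario.
    scenario = {"shaking": 0.45, "flood": 0.8}   # PGA in g, flood depth in m

    loss = sum(asset["vulnerability"][h](x) * asset["replacement_cost"]
               for h, x in scenario.items())
    loss = min(loss, asset["replacement_cost"])  # naive cap; how the hazards interact is the open question
    print(f"Scenario loss for {asset['id']}: {loss:,.0f}")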


Multi-hazard buzz-word

The increased attention being paid to multi-hazards and their interactions is clear from the number of sessions dealing with these issues at this international event. It is likely related to the increased mention of multi-hazards and interacting effects in the 2015-2016 international agreements, such as the Sendai Framework. Whilst it is encouraging and promising for this niche area to receive more attention, there is a clear danger in everyone jumping on the newest buzzword without a clear understanding of the issues surrounding interacting hazards, or a thorough awareness of what has already been done in this area. There is therefore a risk of duplicating work and thinking that has already taken place around multi-hazard issues, rather than pushing the sector forward with the added momentum and interest it has recently received.





N.B. I was on the panel for the “Domino effect” session, focusing on the data issues of interacting hazards. As such, this blog summary does not include all the issues discussed during the panel session, as I was unable to take notes.

Saturday, 4 July 2015

Defining "cascading disasters" and "cascading effects"

A common difficulty in researching multi-hazards is that the terminology used in the literature varies greatly, and standard definitions of multi-hazards, cascading hazards, etc. do not seem to exist. A recent journal article attempts to address this by defining what the terms "cascading disasters" and "cascading effects" mean.

The paper "A definition of cascading disasters and cascading effects: Going beyond the "toppling dominos" metaphor" by Gianluca Pescaroli and David Alexander first look into the ways in which the terms are used in the present literature. The paper then goes on to dig a little deeper into the drivers that tend to distinguish the phenomena. They conclude with proposing definitions for "cascading effects" and "cascading disasters" to be used going forward when referring to these situations. I include them here as extracts from the full paper, which can be freely obtained here.

"Cascading effects are the dynamics present in disasters, in which the impact of a physical event or the development of an initial technological or human failure generates a sequence of events in human subsystems that result in physical, social or economic disruption. Thus, an initial impact can trigger other phenomena that lead to consequences with significant magnitudes. Cascading effects are complex and multi-dimensional and evolve constantly over time. They are associated more with the magnitude of vulnerability than with that of hazards. Low-level hazards can generate broad chain effects if vulnerabilities are widespread in the system or not addressed properly in sub-systems. For these reasons, it is possible to isolate the elements of the chain and see them as individual (subsystem) disasters in their own right. In particular, cascading effects can interact with the secondary or intangible effects of disasters." - Pescaroli and Alexander, 2015

"Cascading disasters are extreme events, in which cascading effects increase in progression over time and generate unexpected secondary events of strong impact. These tend to be at least as serious as the original event, and to contribute significantly to the overall duration of the disaster’s effects. These subsequent and unanticipated crises can be exacerbated by the failure of physical structures, and the social functions that depend on them, including critical facilities, or by the inadequacy of disaster mitigation strategies, such as evacuation procedures, land use planning and emergency management strategies. Cascading disasters tend to highlight unresolved vulnerabilities in human society. In cascading disasters one or more secondary events can be identified and distinguished from the original source of disaster." - Pescaroli and Alexander, 2015


I very much appreciate and celebrate the publication of this paper (it would have made things a lot easier to have this resource at the beginning of my PhD). There does, however, appear to remain a gap in defining or exploring "cascading hazards": both definitions proposed in the paper include the influence the human or social world has upon disasters, or upon the effects that follow an initial natural hazard event. That is valuable in itself, yet I would have thought that defining "cascading hazards" - the more physical or natural side of the issue - would have been the first place to begin.

Perhaps the authors felt that cascading hazards and the domino or triggering effect had been covered enough in the literature - after all, the title suggests an intention to go beyond the triggering chain of hazards. However, I still think there is a gap in the literature dealing with cascading hazards alone. Indeed, the issues and implications discussed in the paper in relation to cascading disasters (such as amplification and interdependencies) are the same as, or similar to, those related to cascading hazards. I guess I will have to wait eagerly for someone to write such a paper (or write one myself).

All in all, the paper is likely to prove valuable to the multi-hazard research field. Its value lies in being the first real attempt to define cascading disasters and cascading effects. Whether the definitions become more refined or are refuted as time goes on, they at least provide a starting point to build upon.


Pescaroli, G. and Alexander, D., 2015. A definition of cascading disasters and cascading effects: Going beyond the "toppling dominos" metaphor. Planet@Risk, 3(1), Special Issue on the 5th IDRC Davos 2014, Global Risk Forum GRF Davos, March 2015.

Tuesday, 19 August 2014

CascEff Project - the Role of Media in a Crisis: Leicester University

Leicester University is currently researching the role of the media in the aftermath of crisis situations, as part of the European Commission's CascEff project. I am looking forward to seeing the results of the study when they are published.
The following is an extract from Leicester University's press pages:
Leicester researchers are examining how news coverage and social media activity during some of the largest disasters in recent years can help decision makers and incident managers prepare for future crisis situations.
Researchers at the University of Leicester’s Department of Media and Communication are contributing to a European Commission-funded project called “CascEff: Modelling of dependencies and cascading effects for emergency management in crisis situations”.
The project will examine the “cascading” effects of both natural and human disasters – where an initial incident can snowball, potentially threatening lives, property and the environment across large areas.
The Leicester researchers are looking specifically at how disasters are framed in both the mainstream media and by members of the public on sites like Twitter as events unfold.
They hope to identify examples of good practice for information dissemination to the public during crises. These will be used to develop a communication strategy for emergency services and incident managers to aid their preparation for future disaster events.
The researchers will examine: natural catastrophes, such as earthquakes and flooding; fires in buildings and tunnels; and outdoor events  such as pop concerts.
Examples may include the Tohoku earthquake and tsunami in Japan in 2011, the effect of hurricane Sandy on New York City in 2012 and the floods in South-West England last winter.
They also plan to look at news coverage of several earlier disasters, including the North Sea flood of 1953, the Sandoz fire and chemical spill in 1986 in Switzerland, and the fires which have broken out in the Channel Tunnel since its opening in 1994.
Dr Paul Reilly of the University’s Department of Media and Communication is leading the Leicester project. He said: “We hope this research will help emergency services in the planning of their communications strategies during crises.
“We will look at both social media and news media during these events, and how they can be used by incident managers to provide accurate, real-time information to members of the public. Clearly the immediacy of social media may have advantages and disadvantages for those involved in managing such situations.  Our focus will be on how key stakeholders can harness these tools to correct rumours and false information and to minimise the cascading effects of these incidents.
“The project will result in a report on the role of the media and the information flows that emerge during crisis situations, as well as a communication strategy which is intended to help decision-makers during future crisis situations.”
The Leicester team will consist of Dr Reilly and Research Associate Dimitrinka Atanasova, with this part of the project due to finish in December 2015.
The CascEff project also includes: SP Technical Research Institute of Sweden; Lund University, Sweden; the Swedish Civil Contingencies Agency; Ghent University, Belgium; INERIS (the French National Institute for Industrial Environment and Risks); KCCE (the Belgian Federal Centre for Civil Security); Safety Centre Europe; Université de Lorraine; Northamptonshire Fire and Rescue Service; and E-Semble, a Netherlands-based firm which makes simulation software for training safety professionals.

Thursday, 31 July 2014

CRUST: Cascading Risk and Uncertainty assessment of earthquake Shaking and Tsunami

Bristol University's Faculty of Engineering have developed an interesting project on earthquakes and triggered tsunamis. The project is called CRUST: Cascading Risk and Uncertainty assessment of earthquake Shaking and Tsunami. I am looking forward to following the research outputs from this project.


The following extract is from Bristol University's Faculty news pages:




Building resilient infrastructure/communities against extremely large earthquakes is a global and urgent problem in active seismic regions. Economic consequences of natural catastrophes have become so devastating, reaching hundreds of billions of pounds in loss, and numerous mega-thrust events are anticipated to occur near vulnerable megacities around the world. The coordination on multiple, inter-related geophysical hazards (e.g. ground shaking and tsunami), analyses of which have been historically undertaken in a disintegrated manner, is needed. Although uncertainty is ubiquitous in natural hazards, treatment of uncertainty in risk assessment is fragmented. Improving the scientific understanding of hazard processes is crucial to better risk forecasting.


CRUST (Cascading Risk and Uncertainty assessment of earthquake Shaking and Tsunami) tackles the global challenge of modelling cascading hazards due to mega-thrust subduction earthquakes by developing a novel methodology for multi-hazards risk assessment from a holistic standpoint and by promoting dynamic and informed decision-making processes for catastrophe risk management. The scientific innovation of the CRUST project lies in a coherent treatment of risk and uncertainty related to compounding risks due to mainshock ground shaking, massive tsunami, and prolific aftershocks acting on coastal infrastructure. Creating a blueprint of the methodology and demonstrating it for several seismic regions are the goals of this project.


Specifically, the research objectives of CRUST are fivefold: (1) to develop an integrated multi-hazards impact assessment methodology for cascading earthquake-related phenomena (i.e. mainshock followed by tsunami and multiple aftershocks); (2) to characterise earthquake slips for future mega-thrust earthquakes as random field, and to evaluate the impact of uncertain slips on strong motion and tsunami simulations; (3) to model a sequence of mainshock-aftershock earthquake records based on actual observations, and to assess their combined effects on nonlinear structural response; (4) to model off-shore tsunami generation and propagation, to characterise tsunami fragility based on numerical simulations, and to validate these with a unique set of experimental data and field observations for the 2011 Tohoku earthquake and tsunami; and (5) to develop practice-oriented engineering guidelines and tools for multi-hazards impact assessment, and to demonstrate their capabilities by applying them to other subduction zones, such as the Hikurangi (New Zealand) and Cascadia (Canada) zones.
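As a non-expert aside on objective (2): characterising slip as a random field essentially means generating many plausible slip maps with realistic spatial correlation and propagating each one through the shaking and tsunami simulations. The stripped-down sketch below is mine, not the CRUST methodology; the correlation length, mean slip, and variability are assumed purely for illustration.

    import numpy as np

    # Minimal illustration of a spatially correlated random slip field (1-D fault trace).
    # Not the CRUST methodology - just the generic idea of sampling correlated slip.
    rng = np.random.default_rng(42)

    n = 50                                   # patches along the fault
    x = np.linspace(0.0, 200.0, n)           # along-strike distance (km)
    corr_len = 40.0                          # correlation length (km), assumed
    mean_slip = 5.0                          # mean slip (m), assumed

    # Exponential covariance between patches, sampled via Cholesky factorisation.
    cov = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))

    for k in range(3):                       # three alternative slip realisations
        slip = np.clip(mean_slip + 2.0 * (L @ rng.standard_normal(n)), 0.0, None)
        print(f"realisation {k}: max slip {slip.max():.1f} m, mean {slip.mean():.1f} m")

Each realisation would then feed a ground-motion and tsunami simulation, which is where the real computational effort in a project like CRUST lies.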

United Nations "Multi-hazard Disaster Risk Assessment" Conference

There is an upcoming conference on multi-hazards being held in China (15/09/2014 - 17/09/2014): United Nations International Conference on Space-based Technologies for Disaster Management "Multi-hazard Disaster Risk Assessment".


The following is an extract from the conference website:




Introduction



UN-SPIDER is the United Nations Platform for Space-based Information for Disaster Management and Emergency Response, a programme implemented by the United Nations Office for Outer Space Affairs (UNOOSA). The UN-SPIDER Beijing Office is pleased to announce the “United Nations International Conference on Space-based Technologies for Disaster Management - Multi-hazard Disaster Risk Assessment” from 15 to 17 September 2014.

The UN-SPIDER Beijing Office has successfully organised three conferences since 2011. Previous conferences covered the themes of “Best Practices for Risk Reduction and Rapid Response mapping” in 2011, “Risk Assessment in the context of global climate change” in 2012 and “Disaster risk identification, assessment and monitoring” in 2013. These conferences offered a forum for disaster management communities and experts to strengthen their capabilities in using space based information to identify, assess, monitor and respond to disaster risks and integrate space technology into long-term disaster risk management efforts.

 

Rationale

Recent disasters around the world have highlighted shortfalls in efforts of the governments and communities, including development partners, in reducing disaster risks. Although early warnings of hydrologic hazards (floods, storm surges, coastal erosion and droughts) and meteorological hazards (cyclones, tornadoes, windstorms etc.) are able to save human lives in some cases, the economic and environmental losses are often huge and recovery will usually take years to normalize. Therefore, countries need to have an increasing focus on economic, environmental and human costs of disasters and develop approaches to lessen the risks and reduce loss of lives and property.

All the elements of disaster risk are spatial in nature. Earth observation and geospatial data provide critical information on elements of risk delivered in the form of maps. These help in predicting and identifying risks more accurately as well as planning responses in a timely manner when they degenerate into a disaster.

Multi-hazard risks give an indication of the overall risk posed to a community. Multi-hazard approaches are valuable in providing an overview of the overall risk and thus enhancing effective planning of countermeasures. Such approaches avoid creating further risks in the attempt to reduce already existing ones. The purpose of this conference is therefore to promote the role of space-based and geospatial information in multi-hazard disaster risk assessment. It seeks to bring together experts and end-users on a single platform to ensure that space-based information is effectively employed in decision-making towards saving lives and reducing economic losses.

Conference Sessions

The conference will cover the following topics:


Session 1: Disaster Risk Management and Space-based information: This session will discuss experiences and good practices of disaster risk management at different levels, with a focus on the role and contribution of space-based information.


Session 2: Approach and methodology in using space-based information in multi-hazard identification and risk assessment: This session will discuss the applied research and development on the approaches, models, methodologies, tools, service platforms and operational projects related to multi-hazard identification and disaster risk assessment.


Session 3: Space-based information resources for hazard identification and risk assessment: This session will discuss the space-based information advances in remote sensing data, information products, software used for multi-hazard monitoring, data visualization and data dissemination tools for disaster risk assessment.


Session 4: Space-based information for damage and loss estimation: This session will discuss the methods and present case studies demonstrating the use of space-based information for disaster damage and loss assessment. This session aims to extend the scope of space-based information beyond emergency mapping, providing valuable information in damage and loss assessment.


Session 5: Networking and engagement with the UN-SPIDER network: This session will aim at promoting the engagement of Member States and partner organisations with the UN-SPIDER Programme. The session will discuss best practices of using space-based information and the impacts of the technical advisory support offered through the UN-SPIDER Programme.


Working groups

Working groups will be organised to discuss the cooperation related to disaster risk reduction mapping services and products, information sharing and cooperation projects in this area. The working groups will develop guiding points on ‘drought monitoring and risk assessment’ at the national level.

 

Target Audience

Disaster managers, policy makers, providers of space technology solutions/tools/applications from governments, academia, research, NGO and corporate sector. Number of expected participants: 120

 

How to apply and application deadline

The final deadline for registration was 29 June 2014.

Monday, 17 March 2014

Westwood Earthquake: Risk of Future Earthquakes-and-Landslides

At 13:25 UTC today, Monday 17th March, a M4.4 earthquake occurred 9 km NNW of Westwood, California, near the city of Los Angeles. Whilst luckily no one was hurt, the earthquake serves as a reminder to those living within the region of the threat posed by the seismic faults beneath them. It is currently unknown, or at least unreported, whether any landslides or other secondary hazards were triggered by the event. The epicentre of the earthquake is fairly close to that of the 1994 M6.7 Northridge earthquake, which caused significant damage.


I am currently on holiday in San Diego, and hearing of an earthquake so close to me (in comparison to when I was in the UK) prompted this blog post. My current research into earthquake-triggered landslides has used the 1994 Northridge earthquake as a case study, investigating 'what-if' scenarios that simulate the potential impact of such an event occurring in the present day. Unfortunately, I do not have my data or sources with me on holiday, but I wanted to capture some thoughts on the event as soon as possible. I hope to update this post with clarification and further details when I return to the UK.

The affected area has been in an earthquake 'drought', with few earthquakes of significant magnitude occurring in recent years. The USGS suggest there is a potential for the M4.4 earthquake to be followed by another earthquake in the near future. Indeed, the entire Los Angeles region is at risk of high magnitude earthquakes. The residents of Westwood and the surrounding region are lucky that today's earthquake was not of a higher magnitude, and I can only hope it serves as a reminder to all those living in the region of how severe a bigger earthquake could be.

There are currently no reports of landsliding in the area as a result of the earthquake. With M4.0 the typically accepted threshold for triggering landslides, we may yet hear reports of landsliding in more remote locations. The 1994 Northridge earthquake triggered tens of thousands of landslides in the surrounding region. Luckily, those landslides had minimal interaction with the population and infrastructure. However, where a landslide did affect a building, the damage it caused was approximately three times greater than the average damage caused by earthquake shaking to similar buildings. Since 1994, there has been substantial development in Los Angeles, the San Fernando Valley, and the hillier areas surrounding Northridge.
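To illustrate, very crudely and with entirely invented numbers (these are not the results of my simulations), why that roughly three-fold damage ratio still leaves shaking as the dominant source of loss unless many buildings are actually hit by landslides:

    # Back-of-envelope illustration only - invented numbers, not my case-study results.
    n_buildings = 100_000                 # buildings exposed to strong shaking (assumed)
    mean_shaking_damage_ratio = 0.05      # average damage ratio from shaking alone (assumed)
    landslide_multiplier = 3.0            # landslide-hit buildings suffer ~3x the average shaking damage
    frac_hit_by_landslide = 0.002         # fraction of buildings hit by a landslide (assumed)
    avg_building_value = 400_000          # USD, assumed

    shaking_loss = n_buildings * mean_shaking_damage_ratio * avg_building_value
    landslide_building_loss = (n_buildings * frac_hit_by_landslide
                               * landslide_multiplier * mean_shaking_damage_ratio
                               * avg_building_value)

    print(f"Loss from shaking across all buildings: ${shaking_loss:,.0f}")
    print(f"Loss at landslide-affected buildings:   ${landslide_building_loss:,.0f}")
    print(f"Ratio: {landslide_building_loss / shaking_loss:.1%}")

The point of the toy numbers is that the landslide contribution scales directly with how many buildings sit on landslide-prone ground, which is exactly why the post-1994 development in the hillier areas matters.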

Whilst I cannot go into specifics of the case-study simulations I have run, due to publication constraints, I can report that initial results suggest that if the 1994 Northridge earthquake were to occur again tomorrow, its potential impact would be greater than what was experienced in 1994. Even a smaller earthquake of, say, M6.0 could potentially cause as much damage as the M6.7 Northridge 1994 event if it were to occur tomorrow. I will be able to share the statistics if and when the paper is accepted for publication.

The USGS PAGER report for today's event states, under estimated losses due to secondary hazards: "Recent earthquakes in this area have caused secondary hazards such as landslides and liquefaction that might have contributed to losses." This highlights a limitation of the PAGER loss estimates: the effects of secondary hazards are not currently accounted for in the report. This is an area the USGS is developing, so that future assessments can cover potential damage from both the main shaking and secondary hazards. Whilst secondary hazards such as landsliding do not always occur as a result of an earthquake, and in most cases the majority of earthquake damage is caused by shaking, there are enough cases where triggered landslides have caused a significant proportion of the damage to warrant concern. In some exceptional events, landsliding has caused the majority of the losses resulting from an earthquake trigger. A predictive tool for assessing landslide and liquefaction hazard and losses would be incredibly useful for emergency responders, giving a more accurate picture of the disaster and helping to locate areas of secondary-hazard damage that may not be accounted for by earthquake shaking alone.

Further research into the potential impact of secondary hazards is still required, and the Southern California region provides an almost unprecedented amount of research and data to draw on, so I have respect for, and faith in, the USGS and Californian research expertise in this area. From what I have learnt, California is one of the most regulated and highly prepared states for earthquake risk. With multiple cities in the state located on active faults with a high potential to experience the 'Big One', officials and researchers have been forced to up their game. Whilst I hope the region does not experience a high magnitude earthquake, the reality is that one will occur.

I hope the M4.4 experienced today has provided a timely reminder to all those living there of what they could face, and prompts them to prepare so they can respond appropriately if and when a bigger earthquake comes their way.

Friday, 6 December 2013

Interacting UK Hazards - Impacts and Origins PhD

Loughborough University has recently advertised a PhD opportunity: "Interacting UK Hazards - Impacts and Origins". It's great to see the field, and its funders, starting to get on board with multiple and interacting natural hazards! (I have added a quick toy illustration of why peril interactions matter for losses at the end of the advert below.)


Interacting UK Hazards – Impacts and Origins

Dr John Hillier, Geography, Loughborough
Dr Gregor Leckebusch, School GEES, Birmingham
Dr Kate Royse, British Geological Survey

Summary:
An excellent, inquisitive and highly-numerate student is sought to combine novel and industry-based GIS methods (i.e., catastrophe modelling) to understand the origins and impacts of interacting hazards as they afflict the UK.

Background:
The UK is affected by several natural hazards (e.g., floods in 2007). These are currently considered independently, but they could interact. A pilot study by the supervisors, using a novel way of examining past data, has robustly shown that interactions can alter likely ‘worst case’ losses by ~£50 million. This is of immediate interest to insurance companies and with much potential to contribute to policy making about the resilience of the UK as climate changes.

Objectives & Methodology:
A core of the work is low risk, building directly upon the pilot study, but scope exists for a student to innovate and excel. A key objective is to understand the origin of the interaction between shrink-swell subsidence losses for clay soils and other risks. This will be done by relating loss data (Zurich Insurance) to recorded weather patterns and developing published work linking subsidence and climate using British Geological Survey (BGS) data (e.g., GeoSure). The strength of interaction between physical processes required to explain the observed impacts will be quantified by generating catastrophe models [e.g., Donat & Leckebusch, 2011; Royse & Hillier In Press] (new QuickCat code). ‘Catastrophe modelling’ is relatively little used in academia, giving potential for exciting developments, and the last stage of this project is a new use for the technique.

Employability:
A secondment to Zurich Insurance Plc. (3-6 months) has been negotiated, and engagement with the BGS is anticipated. Catastrophe modelling underpins all financial risk assessment, and is becoming critical in Disaster Risk Reduction and humanitarian efforts, ideally placing the student for a range of careers. Training will include fieldwork, integrated modelling, GIS, and relevant programming giving the student skills identified as ‘most wanted’ for environmental jobs; ‘modelling’, ‘multi-disciplinarity’, ‘risk and uncertainty’.