Prompt 3: Has your research contributed to innovative data collection strategies? If so, please elaborate.


Natalie Baker, Sam Houston State University
Magdalena Denham, Sam Houston State University

This research is a combination of traditional ethnography, autoethnography, and participatory action research. We also co-construct our findings in what we are calling co-autoethnography. This is an entirely new approach to emic research, especially in the field of disaster studies. We chart the benefits and challenges of such work in our presentation.


Katherine Browne, Colorado State University

We adapt decision-making methods commonly used in agricultural literature by using complexity theory to treat decisions as complex assemblages of factors. This approach allows us to explain the process of disaster recovery as a constellation of influences that shape decisions of survivors as they navigate social and biological interdependencies and risk. This form of data gathering offers a productive and innovative way to identify decision influences situated within stories of recovery and overall well-being. In October, November, January, March, and June 2018, researchers made fieldwork trips to the area, which includes six counties ranging from coastal to inland, as far west as Rockport and as far east as the Beaumont area east of Houston. This research documents a wide range of impact scenarios and considers how variations in geography and risk features interact with these decisions to impact recovery outcomes and well-being.


Paul Chakalian, Arizona State University
Liza Kurtz, Arizona State University
David Hondula, Arizona State University

Due to the fast deployment and low-cost nature of our study, we combined open- and closed-ended measurements in order to maximize both the fidelity and the utility of our data. We specifically employed two novel methods that enhanced our work. First, we entered interview insights into mobile survey software while conducting the interviews “live,” so that key information could be accessed in tabular form immediately after each conversation. Second, we administered closed-ended, self-reported psychological questionnaires before every interview. The use of psychometric measures in hazards research is fundamentally new and may allow us to better understand the specific mechanistic pathways that lead from vulnerability indicators (e.g., socioeconomic status or age) through psychological disposition to negative outcomes. Better understanding these vulnerability pathways, and how structural variables shape psychology and agency to create disparate outcomes, is a pressing research need.


Geoffrey Chua, Nanyang Technological University
Wee-Kiat Lim, Nanyang Technological University

Our research is a collaboration between operations management and organizational and institutional researchers, which leads to a novel interplay of data collection methods, both qualitative and quantitative. In the field, interviews and field observations—methods more associated with qualitative and organizational and institutional studies—elicit the logic, motivation, and context that provide the building blocks (e.g., business rules) for the optimization models and simulations that quantitative methods typically use in operations management studies. In turn, the results become the means for us to re-engage practitioners for more in-depth probes and fieldwork. This leads us to sample more expert practitioners, access more diverse sites (e.g., blood banks, clinics, and hospitals at national, regional, and community levels), and create more granular archives (e.g., operational data and reports of a blood bank). This iterative cycle helps us build a richer picture of the blood donation and management system of a specific locale.


John Cross, University of Wisconsin Oshkosh

Our data collection technique is somewhat innovative: a survey of emergency management directors about the use of historical markers and monuments to communicate hazard risk information couples a relatively standard questionnaire with information about how other communities disseminate hazards information. The survey thus provides the researcher with data while simultaneously educating respondents about an approach to hazard risk communication that has been overlooked in many jurisdictions.


Ziqiang Han, Shandong University

This presentation introduces data collection procedures and efforts linking disaster to higher education and service learning in our teaching and research practices. Service learning is an education method used in professional development and training in colleges and universities. It allows students to apply the knowledge and skills learned at the university in service settings, builds their sense of civic responsibility, and enriches community capacity. In this study, we developed an undergraduate course named “disaster service learning and nonprofit leadership development.” With their mentors’ help, the students and the college worked with local nongovernmental and government agencies to conduct disaster education, service, and research in the selected communities. Data collected through these processes were used to write research papers. Moreover, risk reduction solutions and resources were provided to local communities to reduce risks and to increase their emergency preparedness and resilience.


Navid Jafari, Louisiana State University
Anand Puppala, University of Texas at Arlington
Surya Congress, University of Texas at Arlington
Murad Nazari, Louisiana State University
Jack Cadigan, Louisiana State University

Our work develops innovative data collection strategies for waste debris collection and disposal after extreme events. The current method for estimating waste debris volumes is purely qualitative (e.g., based on the visual observations of an individual peering into a truck bed), and these data are stored electronically by monitoring consultants. Landfills can also measure the tonnage of incoming trucks via printed tickets, but this rarely occurs because of the overwhelming number of incoming trucks. The data collection implemented through a National Science Foundation RAPID grant aimed to transform post-disaster debris management by exploring the use of smartphones and drones to automate waste volume quantification. Image processing algorithms stitch the photographs together via photogrammetry and scale them using a reference object, producing a three-dimensional rendering of the pile and an estimate of waste volume. These methods were implemented and compared in Beaumont, Texas, after Hurricane Harvey.
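The volume estimate itself reduces to integrating the height of the photogrammetric surface above the ground over the mapped footprint. The snippet below is a minimal sketch of that step only, assuming a gridded surface model has already been exported from the photogrammetry workflow; the grid values, function name, and cell size are hypothetical and do not represent the authors' actual pipeline.

```python
# Minimal sketch (not the authors' pipeline): estimate debris-pile volume from a
# gridded surface model, e.g., one exported from photogrammetry software after
# the drone/smartphone images have been stitched and scaled with a reference object.
import numpy as np

def debris_volume(surface_m, ground_m, cell_size_m):
    """Sum the height of the debris surface above the ground over all grid cells.

    surface_m   -- 2D array of debris-pile surface elevations (meters)
    ground_m    -- matching 2D array (or scalar) of pre-disaster ground elevations
    cell_size_m -- horizontal grid spacing of the surface model (meters)
    """
    heights = np.clip(surface_m - ground_m, 0.0, None)  # ignore cells below ground
    return float(heights.sum() * cell_size_m ** 2)      # volume in cubic meters

# Hypothetical 1 m grid: a 20 m x 30 m footprint with debris averaging ~1.2 m deep.
rng = np.random.default_rng(0)
surface = np.maximum(rng.normal(1.2, 0.6, size=(20, 30)), 0.0)
print(f"Estimated waste volume: {debris_volume(surface, 0.0, 1.0):.1f} m^3")
```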


Navid Jafari, Louisiana State University
Xin Li, Louisiana State University
Qin Chen, Northeastern University

The data currently available for hydrologic modeling include stream gauges and high water marks obtained after floodwaters have receded. However, high water marks provide only one elevation, usually without a corresponding time, while stream gauges are typically sparse and located away from flooded streets and residential areas. As a result, a more spatially dense database of continuous water-level measurements is needed to develop more accurate flood hydrographs. Such a dataset exists in the form of images and video footage from traffic intersection and interstate highway cameras, major news media outlets, and social media, and it can be used to develop a time history of flood inundation in the Houston metropolitan area. In particular, accessing traffic cameras and video represents a major step forward in urban flood inundation mapping because intersection cameras continuously photograph the rising and falling water levels.
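As an illustration of how such imagery could feed a hydrograph, the sketch below converts a handful of timestamped water-level readings, as might be read off camera frames at one intersection, into a regular hourly stage series. The timestamps, stage values, and names are invented for illustration and are not taken from the study.

```python
# Minimal sketch (illustrative only): turn irregular, timestamped water-level
# readings extracted from traffic-camera frames into an hourly flood hydrograph
# for a single intersection.
import pandas as pd

readings = pd.Series(
    data=[0.1, 0.4, 0.9, 1.6, 1.3, 0.7],   # stage in meters, read off camera frames
    index=pd.to_datetime([
        "2017-08-27 02:15", "2017-08-27 05:40", "2017-08-27 09:05",
        "2017-08-27 14:30", "2017-08-27 20:10", "2017-08-28 03:55",
    ]),
    name="stage_m",
)

# Resample to a regular hourly step and interpolate in time between camera frames.
hydrograph = readings.resample("1H").mean().interpolate(method="time")
print(hydrograph.head())
print(f"Peak stage: {hydrograph.max():.2f} m at {hydrograph.idxmax()}")
```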


Yen-Lien Kuo, National Cheng Kung University
Hsin-En Liu, National Cheng Kung University

Our research showed that official and nongovernmental rehabilitation assistance had very limited effects on subjective post-disaster recovery. Traditional indicators of recovery, such as housing or reconstruction, may misrepresent the speed of recovery. A person’s life satisfaction can be a better indicator of real recovery status, but it requires an in-person survey. According to the literature, life satisfaction is related to a variety of factors, such as health and income. In Taiwan, because national health insurance coverage and access to utilities are high, victims’ physical and mental illnesses can be revealed by health insurance claims and by utility consumption in residential areas. Electricity consumption, traffic, and telecommunication in business areas are also closely related to local economic activity. These data may not be as good as survey data, but they can provide real-time recovery status.


Gina Lane, Texas A&M University
Nathanael Rosenheim, Texas A&M University
Walter Peacock, Texas A&M University

Food banks play a critical role in post-disaster recovery. However, little data has been collected on how these agencies function after a major disaster. Food banks operate as food distribution warehouses and provide food to charitable agencies that run food pantries or soup kitchens. To collect data on how a regional food bank’s operations were affected by Hurricane Harvey, innovative strategies were developed with support from a National Science Foundation RAPID grant. First, food bank distribution data were obtained from Primarius, a food bank-specific inventory management system. Second, a focus group with food aid agency and food bank managers provided valuable insight. Finally, a food aid agency survey was implemented with Qualtrics research software. A strong relationship with food bank managers, established before Hurricane Harvey, played a critical part in the data collection process.


John McClure, Victoria University of Wellington
Liv Henrich, Victoria University of Wellington
Caitlin McCrae, Victoria University of Wellington
Caspian Leah, Victoria University of Wellington

This research uses novel data collection strategies, drawing on local government council data on earthquake-prone buildings and voluntary earthquake checks to track the rates of retrofitting these buildings following recent major earthquakes. These archival data indicate whether people are remediating these buildings well before the deadline required by legislation, and they suggest that the experience of earthquakes is driving social norms that add to the effects of legislation. They also show that this effect is absent for dwellings that are not covered by this legislation. We know of no other research that has used this methodology to examine patterns of actions leading to earthquake resilience.


Scott Miles, University of Washington
Troy Tanner, University of Washington

University of Washington students interviewed reconnaissance researchers and analyzed many past disaster reconnaissance reports to synthesize and understand user experiences related to field data collection. Students ideated based on their research to produce a suite of low-fidelity app prototypes. Subsequently, the National Science Foundation RAPID Facility convened a workshop attended by over 70 reconnaissance researchers. These future users participated in student-designed, interactive activities to provide more targeted user experience insights, as well as app design ideas. Over 1,500 sticky notes produced at the workshop were analyzed to inform procurement of facility equipment, as well as additional iterations of a high-fidelity RApp prototype, which is now informing implementation. A resultant core design principle for the RApp is that all data types be treated equally. Meeting this design principle will make it easier for users to adapt the RApp to existing and new workflows to achieve their respective disciplinary and interdisciplinary objectives.


Adelle Monteblanco, University of Texas at El Paso
Jennifer Vanos, University of California, San Diego
Sarah LeRoy, University of Arizona
Patricia Juárez-Carrillo, University of Texas at El Paso
Gregg Garfin, University of Arizona
Hunter Jones, National Oceanic and Atmospheric Administration

This project broadly aims to increase resilience to the public health risks of extreme heat episodes in the United States-Mexico border region. This border region, including El Paso, Texas, is expected to confront hotter, longer, and more frequent heat waves in future years. We focus resilience efforts on particularly vulnerable subgroups: pregnant women and fetuses. To offer an important leap in awareness and preparedness for these vulnerable subgroups, our outreach strategies targeted frontline, trusted maternal health providers: midwives and doulas. We offer the first examination of maternal health providers’ perception and knowledge of the impact of extreme heat on their patients’ birth outcomes. We also report on the deployment of a free “Heat Exposure and Maternal Health” workshop; through bilingual preparedness materials, this workshop helps midwives and doulas communicate heat dangers to pregnant patients and share resources to reduce exposure.


Walter Peacock, Texas A&M University
Nathanael Rosenheim, Texas A&M University
John van de Lindt, Colorado State University
Judith Mitrani-Reiser, National Institute of Standards and Technology

While there are some examples of interdisciplinary field research, such as National Oceanic and Atmospheric Administration rapid response teams that go into the field after weather events, and post-earthquake research teams sent out by the Earthquake Engineering Research Institute, few of these teams have attempted to gather data employing systematic random samples in the communities or areas under study. Random sampling strategies better ensure that the data gathered are representative of both the population under study and the nature of the event’s impact on the built and social environments. This paper presents a strategy that makes use of census and cadastral data, along with Google Street View and Google My Maps, to develop and implement a sampling strategy in a rapid post-disaster field study to capture flooding impacts to households, housing, and critical infrastructure (e.g., electric power network, water, gas). Interdisciplinary field team development, logistics, and operations are also discussed. 
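A minimal sketch of the sampling step is shown below, assuming a cadastral parcel table already joined to census block groups. The field names (parcel_id, block_group), the per-stratum quota, and the systematic-sampling variant are illustrative assumptions, not the paper's exact procedure.

```python
# Sketch: draw a systematic random sample of parcels within each census block group
# from a cadastral table, mirroring the kind of sampling frame the strategy describes.
import numpy as np
import pandas as pd

def systematic_sample(parcels: pd.DataFrame, per_stratum: int, seed: int = 42) -> pd.DataFrame:
    """Sort parcels within each block group and take every k-th one after a random start."""
    rng = np.random.default_rng(seed)
    sampled = []
    for _, group in parcels.sort_values("parcel_id").groupby("block_group"):
        k = max(len(group) // per_stratum, 1)   # sampling interval
        start = rng.integers(0, k)              # random start within the first interval
        sampled.append(group.iloc[start::k].head(per_stratum))
    return pd.concat(sampled)

# Hypothetical cadastral extract: 500 parcels spread over 5 block groups.
parcels = pd.DataFrame({
    "parcel_id": range(500),
    "block_group": np.repeat([f"BG-{i}" for i in range(5)], 100),
})
print(systematic_sample(parcels, per_stratum=10)["block_group"].value_counts())
```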


Margaret Reams, Louisiana State University
Ryan Kirby, Louisiana State University
Nina Lam, Louisiana State University
Michelle Meyer, Louisiana State University

The study provides an innovative “tweet analysis” as a new empirical measure of bidirectional communication and information-sharing asymmetries between the public and organizations/agencies in the field of emergency management. Our interdisciplinary team used 14 million tweets sent during Hurricane Sandy to develop this measure. The tweet analysis seeks to understand communication gaps among individuals and organizations, such as the central and local/regional chapters of the American Red Cross or the central and regional offices of the Federal Emergency Management Agency, within the time frame of the purchased tweets. The results of this analysis will be visualized temporally to illustrate any communication gap in the frequency of tweets sent by members of the public and responded to by the agency or organization. A larger asymmetry, or difference between the number of tweets going into an agency and the number going out in response, would suggest lower levels of bidirectional communication. This measure could then be used in statistical analyses as a potential influence on recovery outcome measures following disasters.
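As a rough illustration of such an asymmetry measure, the sketch below counts, per day, tweets directed at an agency handle and the agency's replies, and takes their ratio. The column names, handles, and micro-sample are hypothetical and are not drawn from the project's 14-million-tweet corpus.

```python
# Sketch (not the project's actual pipeline): compute a simple daily asymmetry
# measure from a table of tweet metadata, comparing messages directed at an
# agency account with the agency's replies.
import pandas as pd

def daily_asymmetry(tweets: pd.DataFrame, agency_handle: str) -> pd.DataFrame:
    """Return inbound counts, outbound reply counts, and their ratio per day."""
    df = tweets.copy()
    df["date"] = pd.to_datetime(df["created_at"]).dt.date
    inbound = df[df["mentions"] == agency_handle].groupby("date").size()
    outbound = df[(df["author"] == agency_handle) & df["in_reply_to"].notna()].groupby("date").size()
    out = pd.DataFrame({"inbound": inbound, "outbound": outbound}).fillna(0)
    # Ratio of messages received to replies sent; undefined (NaN) when no replies went out.
    out["asymmetry"] = out["inbound"] / out["outbound"].where(out["outbound"] > 0)
    return out

# Hypothetical micro-sample of tweet metadata around the Hurricane Sandy landfall.
sample = pd.DataFrame({
    "created_at":  ["2012-10-29 10:00", "2012-10-29 11:00", "2012-10-29 12:00", "2012-10-30 09:00"],
    "author":      ["resident1",        "fema",             "resident2",        "resident3"],
    "mentions":    ["fema",             None,               "fema",             "fema"],
    "in_reply_to": [None,               "resident1",        None,               None],
})
print(daily_asymmetry(sample, "fema"))
```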


David Roueche, Auburn University
Frank Lombardo, University of Illinois at Urbana-Champaign
Rich Krupar, Berkshire-Hathaway
Daniel Smith, James Cook University

An innovative component of our research was the use of a data collection platform that provided a customizable, multi-scale tool for rapidly collecting, curating, processing, and archiving damage observations joined to critical metadata. A key component of this platform was a smartphone app that combined multiple data types (e.g., geotagged photographs, videos, and text-based damage assessment forms) into a single geotagged record for each assessed structure. All data collected in the field were synced to a web server, where a web-based platform provided an interactive tool for performing quality control checks on the field data and supplementing it with additional metadata. The complete database can be easily exported to common geographic information system file formats for permanent archiving in platforms such as the National Science Foundation’s Natural Hazards Engineering Research Infrastructure DesignSafe cyberinfrastructure. The approach demonstrates a framework for enhancing the data collection and analysis capabilities of rapid post-disaster assessments.
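A minimal sketch of what one such geotagged record might look like, and how a set of records could be exported to a GIS-ready format, is given below. The schema, field names, and GeoJSON export are assumptions for illustration, not the team's actual app or database.

```python
# Sketch: bundle several data types for one assessed structure into a single
# geotagged record and export a set of records as GeoJSON, a common GIS format.
import json

def make_record(record_id, lat, lon, photos, videos, damage_form):
    """One assessed structure = one geotagged feature with all media and form data attached."""
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": {
            "record_id": record_id,
            "photos": photos,           # file names of geotagged photographs
            "videos": videos,
            "damage_form": damage_form  # responses from the text-based assessment form
        },
    }

records = [
    make_record("ST-001", 29.97, -94.01, ["ST-001_a.jpg"], [], {"roof_cover_loss_pct": 40}),
    make_record("ST-002", 29.98, -94.02, ["ST-002_a.jpg", "ST-002_b.jpg"], ["ST-002.mp4"],
                {"roof_cover_loss_pct": 10}),
]

with open("assessments.geojson", "w") as f:
    json.dump({"type": "FeatureCollection", "features": records}, f, indent=2)
```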


Andrew Rumbach, University of Colorado Denver
Esther Sullivan, University of Colorado Denver
Mayra Gonzales, University of Colorado Denver
Sebastian Montenegro, University of Colorado Denver

Data collection on mobile home parks is challenging because states and jurisdictions do not typically keep an inventory of where mobile home parks are located or how many housing units they contain. To document the damage to mobile home parks in the Houston Metropolitan Statistical Area, we developed a unique methodology for inventorying existing parks at the time of the hurricane and estimating their exposure. Our methodology combines land use records, remotely sensed imagery, census data, Federal Emergency Management Agency grant information, and field reconnaissance visits. The result is the first comprehensive inventory of mobile home parks for a major American city, and the only analysis of Hurricane Harvey damage that focuses specifically on mobile home park housing. This methodology could be replicated for future studies, greatly enhancing our ability to study this important housing type.  
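As a toy illustration of the exposure-estimation step, the sketch below joins a hypothetical park inventory (unit counts derived from records and imagery) with modeled flood depths at each park and tallies exposed units. All identifiers and values are invented and do not reflect the Houston inventory.

```python
# Sketch with made-up inputs: combine a mobile home park inventory with flood
# depths sampled at each park footprint to produce a rough count of exposed units.
import pandas as pd

inventory = pd.DataFrame({
    "park_id": ["P01", "P02", "P03"],
    "units":   [120, 45, 80],          # housing units counted from imagery/records
})
flood = pd.DataFrame({
    "park_id":  ["P01", "P02", "P03"],
    "depth_ft": [2.5, 0.0, 0.8],       # modeled depth at the park footprint
})

exposure = inventory.merge(flood, on="park_id")
exposure["exposed_units"] = exposure["units"].where(exposure["depth_ft"] > 0, 0)
print(exposure)
print("Total exposed units:", int(exposure["exposed_units"].sum()))
```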


Roxane Cohen Silver, University of California, Irvine

Designing and implementing research on collective traumas (e.g., natural disasters) requires overcoming formidable scientific and logistical challenges resulting from the fundamental unpredictability of these events. Previous studies that have examined adjustment to community disasters usually involve asking respondents to recall events long after they have occurred, making it difficult to disambiguate the effects of trauma on subsequent adjustment. Drawing more solid conclusions about long-term disaster response requires having a large sample in which information about pre-event mental and physical health, and baseline assessments of psychological responses, have been collected during the acute period. The investigators in this project have been at the forefront in developing and maintaining a post-disaster longitudinal sample and were able, with National Science Foundation support and collaboration with our Institutional Review Board, to rapidly conduct data collection in advance of Hurricane Irma making landfall in Florida. Careful planning ensured high retention of the sample to enable repeated assessment of post-hurricane responses. 


Robert Soden, University of Colorado Boulder
Austin Lord, Cornell University

Our research relied on qualitative, open-ended interviews that incorporated elements of participatory mapping with residents of the Langtang Valley in Nepal. During the April 2015 earthquake, Langtang was hit by a major co-seismic landslide, the impacts of which the government damage assessment was unable to describe. Our work builds on scholarship in critical cartography that undermines the notion of maps as neutral or unbiased representations of space, often controlled by centers of power and authoritative knowledge. We explore alternate possibilities of what maps can be or do; in particular, we investigate how maps serve as methodological tools in rapid reconnaissance research. How do research participants appropriate maps and mapping to communicate or generate new knowledge? How can the practice of mapping serve to connect participants and researchers during reconnaissance work in post-disaster settings? This work offers both a caution to uncritical adoption of quantitative depictions of disaster impact and a path toward new alternatives.


Elaina Sutley, University of Kansas
Tori Tomiczek, United States Naval Academy
Maria Dillard, National Institute of Standards and Technology
Maria Koliou, Texas A&M University
Derya Deniz, Ozyegin University
Sara Hamideh, Iowa State University
John van de Lindt, Colorado State University
Judith Mitrani-Reiser, National Institute of Standards and Technology
Andre Barbosa, Oregon State University
Maria Watson, Texas A&M University
Yu Xiao, Portland State University
Jennifer Helgeson, National Institute of Standards and Technology

The first wave of our data collection was conducted approximately one month after the flooding in Lumberton, North Carolina. Detailed, component-based engineering damage evaluations classified initial damage to residential structures. Evaluations were based on observed exterior conditions and the housing cleanup debris left on the street. Because of the degree of population dislocation, many units were unoccupied, preventing the team from assessing interior damage and confirming high water marks. The second wave, performed approximately one year later, surveyed the same households. Survey respondents were asked about the water height inside the home, damage to the home and belongings, and repairs to restore structural and nonstructural building components. The same approach was applied to both homes and businesses during the second wave. Knowledge of the interior proved critical for accurate flood-induced damage assessment and classification. In addition, the new data underscored the need for longitudinal field studies for holistic reconnaissance.


Joseph Tuccillo, University of Colorado Boulder
Seth Spielman, University of Colorado Boulder

This research overcomes a persistent conflict between spatial and demographic resolution in social vulnerability assessment. While Public Use Microdata Sample (PUMS) data features high demographic resolution useful for understanding how individuals could be impacted by a hazard, its coarse spatial resolution is a limiting factor: each response is linked to a Public Use Microdata Area (PUMA) of roughly 100,000 people. It is often impossible to infer how small-scale hazard processes could affect individuals within a PUMA. However, by using statistical geo-location techniques (i.e., microsimulation) to match microdata to small areas, we generate a novel data source for hazards assessment with both high spatial and high demographic resolution. In it, each person or household has a profile of PUMS traits relatable to specific forms of sensitivity and adaptive capacity. Using such information, hazard mitigation, response, and recovery efforts can be targeted toward individuals or, in aggregate, toward key population groups in need of assistance.
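The sketch below illustrates the general idea of statistically allocating microdata to a small area, here by weighting hypothetical PUMS households so that a tract-level marginal is reproduced. It is a simplified stand-in for the authors' microsimulation procedure, and all records, fields, and targets are invented.

```python
# Sketch: weighted selection of microdata records so that a small-area total is
# matched, a bare-bones illustration of microsimulation for synthetic populations.
import numpy as np
import pandas as pd

pums = pd.DataFrame({                 # microdata: rich attributes, coarse geography
    "hh_id": range(6),
    "income": [18000, 25000, 40000, 60000, 90000, 120000],
    "over_65": [1, 0, 1, 0, 0, 1],
})
tract_target = {"n_households": 100, "share_over_65": 0.30}   # small-area constraint

# Weight each PUMS household so that sampling reproduces the tract's over-65 share.
p_over = pums["over_65"].mean()
weights = np.where(pums["over_65"] == 1,
                   tract_target["share_over_65"] / p_over,
                   (1 - tract_target["share_over_65"]) / (1 - p_over))
weights = weights / weights.sum()

rng = np.random.default_rng(1)
synthetic = pums.iloc[rng.choice(len(pums), size=tract_target["n_households"], p=weights)]
print("Synthetic over-65 share:", synthetic["over_65"].mean())
```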


Joseph Wartman, University of Washington
Jeff Berman, University of Washington
Mike Olsen, Oregon State University
Jennifer Irish, Virginia Tech
Scott Miles, University of Washington
Laura Lowes, University of Washington
Kurt Gurley, University of Florida
Ann Bostrom, University of Washington

Instrumentation at the Natural Hazards Engineering Research Infrastructure Natural Hazard Reconnaissance Facility (or “RAPID”) spans a range of accuracies, resolutions, and spatial scales to provide the data needed to validate the various models used in engineering and the social sciences. The resources are also useful for collecting data before, during, and after events, capturing how conditions change over time. Current reconnaissance efforts are typically limited to ad hoc, uncalibrated measurements (e.g., tape measures and consumer global positioning systems) that are not linked to a common spatial reference frame, which results in significant uncertainty in hazard modeling. For example, various prediction equations for liquefaction-induced settlement and lateral spreading rely on empirical observations and measurements that are often obtained by people who are not adequately trained to measure and acquire geospatial data. This results in erroneous measurements, improper data reduction, and, ultimately, a high level of uncertainty in these models. Systematic data collection helps reduce bias and ensures that the data can be queried for additional measurements.