Perspectives

Developing a culture of safety in biomedical research training

    Published Online: https://doi.org/10.1091/mbc.E20-03-0167

    Abstract

    The National Institute of General Medical Sciences (NIGMS) at the U.S. National Institutes of Health (NIH) is committed to supporting the safety of the nation’s biomedical research and training environments. Institutional training grants affect many trainees and can have a broad influence across their parent institutions, making them good starting points for our efforts to promote the development and maintenance of robust cultures of safety at U.S. academic institutions. In this Perspective, we focus on laboratory safety, although many of the strategies we describe are also applicable to other forms of safety, including the prevention of harassment, intimidation, and discrimination. We frame the problem of laboratory safety using a number of recent examples of tragic accidents, highlight some of the lessons learned from these and other events, discuss what NIGMS is doing to address problems related to laboratory safety, and outline steps that institutions can take to improve their safety cultures.

    All new funding opportunity announcements (FOAs) for training programs supported by the National Institute of General Medical Sciences (NIGMS) contain the expectation that the programs will promote “inclusive, safe and supportive scientific and training environments.” In this context, the word “safe” refers to several aspects of safety. First, we mean an environment free from harassment and intimidation, in which everyone participating is treated in a respectful and supportive manner, optimized for productive learning and research. We also mean that institutions should ensure that their campuses are as safe as possible so that individuals can focus on their studies and research. Finally, we mean safety in the laboratory and clinical spaces. In this Perspective, we focus on this last issue and describe some of the approaches NIGMS is taking to help the biomedical research community move toward an enhanced culture of safety in which core values and the behaviors of leadership, principal investigators (PIs), research staff, and trainees emphasize safety over competing goals.

    RECENT LABORATORY ACCIDENTS: SOME TRAGIC EXAMPLES

    In January of 2006, a liquid nitrogen tank that had been improperly modified to seal its pressure relief valves (presumably because the valves had failed in the past) exploded in the middle of the night in a lab at Texas A&M University (Lowe, 2006; Texas State Fire Marshal, 2006; Kemsley, 2019). The blast destroyed the lab, showering it with shrapnel, blowing out windows and walls, and creating a large hole in the floor. The force of the explosion propelled the tank like a missile through the ceiling of the lab into the mechanical room above, where it broke through the building’s main water lines. Solvents and other chemicals from the many broken bottles were strewn across the accident site. Because the accident happened in the middle of the night when the lab was empty, no one was hurt, but had anyone been in the lab at the time of the explosion they would probably have been killed.

    In December of 2008, a recent UCLA graduate named Sheharbano “Sheri” Sangji was working in an organic chemistry lab at the university (Benderly, 2018; Kemsley, 2018; Trager, 2019). She was setting up a chemical reaction using the reagent t-butyllithium, a pyrophoric compound that can spontaneously ignite on contact with air. As she was transferring the reagent in a 60-ml plastic syringe, the plunger came out and the compound ignited and sprayed onto her. She was not wearing a lab coat, and her clothes caught fire. She received extensive third-degree burns and died in the hospital 2 1/2 weeks later. The university and the PI of the lab were charged with felony violations of California law. The accident prompted the formation of the University of California Center for Laboratory Safety at UCLA (Benderly, 2011; University of California Los Angeles, 2020b).

    In September of 2009, Malcolm Casadaban, a PI at the University of Chicago, died from an accidental infection with an attenuated laboratory strain of Yersinia pestis, the bacterium that causes plague (Centers for Disease Control, 2011; Kaiser, 2011). The strain was compromised in its ability to scavenge iron from its host. However, Casadaban, apparently unbeknownst to him, had hemochromatosis, a condition that leads to heightened levels of iron in the blood. It is speculated that his condition may have allowed the attenuated bacterial strain to grow and become pathogenic. A Centers for Disease Control report on the case (Centers for Disease Control, 2011) concluded that the inconsistent use of gloves by the researcher while handling bacterial samples could have led to “inadvertent transdermal exposure.”

    In April of 2011, Michele Dufault, an undergraduate at Yale University majoring in physics and astronomy, was working alone at night in a machine shop in the Sterling Chemistry Laboratory (Foderaro, 2011; Kemsley, 2011). Her hair became caught in the rapidly spinning shaft of a lathe and she was unable to free herself. She died of asphyxiation.

    In March of 2016, Thea Ekins-Coward, a postdoctoral fellow at the University of Hawaii at Manoa, was growing bacteria under atmospheres containing mixtures of hydrogen, oxygen, and carbon dioxide gases (Benderly, 2016a, b; Kemsley, 2016). Despite a small ignition accident the day before, the research group decided to scale up the experiment. When Ekins-Coward touched the pressure gauge on the ungrounded tank containing the gases, a spark of static electricity was apparently transferred, igniting the gas mixture and causing an explosion that severely injured her and caused serious damage to the lab (Figure 1). Although Ekins-Coward survived, she lost an arm and suffered other significant injuries.

    FIGURE 1:
    FIGURE 1: The aftermath of the University of Hawaii at Manoa explosion. The remains of a ruptured steel tank sit at the site of the explosion, near the south wall of the lab. Photo credit: Honolulu Fire Department. Used with permission.

    In addition to these tragic and alarming examples of catastrophic accidents that have happened in U.S. research labs over the past 15 years, many other accidents happen on a more “routine” basis (University of California Center for Laboratory Safety, 2020a). These include: 1) burns and other injuries involving autoclaves, for example, caused by exploding capped bottles or by improperly used or maintained equipment; 2) eye and skin burns from phenol, acids, bases, or other caustic agents; 3) eye damage from lasers; 4) fires caused by gas burners and ethanol used for sterilization; 5) electric shocks from electrophoresis apparatus and other electrically powered equipment; 6) lacerations from razor blades, scalpels, or broken glassware; and 7) needle sticks. Although no comprehensive system exists for reporting the frequency of “routine” and near-miss lab accidents, we urge readers to think back over their careers and reflect on the accidents they have personally witnessed.

    GENERAL THEMES ARISING FROM LAB ACCIDENTS AND NEAR MISSES

    A number of common themes have emerged from retrospective analyses of these and other laboratory accidents. First, serious lab accidents are not restricted to chemistry labs; they can and do occur in any kind of research lab or clinical setting. Second, in almost every major accident investigated, deficiencies in training and/or oversight have been identified as significant factors leading up to the event (Menard and Trant, 2020). Third, while there have been isolated efforts to create a robust institutional culture of safety, particularly after a major accident (U.S. Chemical Safety and Hazard Investigation Board, 2011; Hill, 2012; Safety Culture Task Force of the ACS Committee on Chemical Safety, 2012; APLU Council on Research Task Force on Laboratory Safety, 2016; Staehle et al., 2016; University of California Center for Laboratory Safety, 2016a, b), there is still an urgent need to be proactive and make these efforts widespread. For training programs, for example, safety should not be thought of as separate from the didactic and mentoring activities provided to the students. Approaches in which safety is taught only by officials in the university’s Environmental Health and Safety (EH&S) office, frequently during the students’ first few weeks on campus, should be replaced with safety instruction that is integrated into the program’s curriculum at multiple points in order to develop trainees’ skills and build on their actual lab experiences. Integrating safety instruction into all aspects of a training program sends a message to students about the importance of safety and will contribute to the development of a culture of safety. Research advisors must assume responsibility and strongly emphasize that safety is a priority over and above getting the next big result (Kemsley, 2013). Institutional leaders also have an important role to play in prioritizing safety, setting the tone, and creating the expectations and incentives to promote a top-to-bottom culture of safety (Van Noorden, 2011; Gibson et al., 2014).

    A fourth general lesson from recent lab accidents is that, by their very nature, academic institutions have frequent turnover of lab personnel and a yearly influx of new and inexperienced researchers and students, which creates a need for robust and repeated safety training and oversight. Despite this need, safety training and oversight in academia generally pale in comparison with those in industry (Van Noorden, 2011; McLeod, 2015; Schroder et al., 2016), an environment in which turnover is lower and the average level of experience is higher than in academic labs. Moreover, academic labs have an abundance of researchers in their 20s, an age group generally considered to be less risk averse than older adults (Rolison et al., 2014; Rutledge et al., 2016). Additionally, students are not usually covered by workers’ compensation, which can create a variety of financial and legal problems if they are injured in a lab accident. The U.S. Occupational Safety and Health Administration (OSHA) also has limited jurisdiction over accidents involving students because they are not employees, which can complicate the follow-up to an accident and change institutional accountability in ways that may not help promote a culture of safety.

    One of the strengths of academic research is the culture of intellectual freedom and boundary-pushing, which can harness individual creativity and passionate hard work to produce extraordinary scientific advances. On the other hand, this culture can also promote unsafe behaviors such as working alone late at night in the lab or sacrificing safety considerations in the interest of getting data faster (Benderly, 2009, 2016c). High-risk, high-reward science should not be unsafe science.

    Finally, funders must take responsibility as well. A number of reports on lab accidents have suggested that funding agencies should increase expectations for safety in the awards they make (U.S. Chemical Safety and Hazard Investigation Board, 2011; Benderly, 2014, 2015; APLU Council on Research Task Force on Laboratory Safety, 2016). Because NIGMS has the largest training grant portfolio at the National Institutes of Health (NIH), and because training grants affect so many students and can have a significant impact on institutional policies and culture, we feel that our training grant portfolio is the best place to start the Institute’s efforts to enhance the culture of safety at academic institutions.

    WHAT IS NIGMS DOING TO ADDRESS LABORATORY SAFETY IN BIOMEDICAL RESEARCH TRAINING?

    As mentioned above, we have been incorporating an increased emphasis on safety into the FOAs for all of NIGMS’ training grants. Applicants and reviewers are now asked to address how a program will promote a safe research training environment. In the required institutional letter of support for training grant applications, senior institutional leaders such as provosts or deans must provide information about how the institution ensures that the research and clinical facilities, as well as the laboratory and clinical practices, promote the safety of the trainees. The peer review panels that evaluate the applications are asked to assess the adequacy of these policies, procedures, and plans. Lab safety is also a topic that can be covered during the teaching of Responsible Conduct of Research (RCR); reviewers likewise evaluate the RCR training plans and how RCR training is incorporated into each program’s curriculum. Moving forward, NIGMS plans to place even more emphasis on safety in training grant FOAs, with a focus on encouraging programs to integrate safety education throughout the didactic and mentored portions of their curricula. The goal of research training is to give students the skills, knowledge, and attitudes needed to do the best possible science, and safety cannot be separated from that goal. Planning a good experiment also means planning a safe experiment.

    To help programs meet these new expectations, we have been providing administrative supplements to currently funded training grants to develop new curricular offerings and other activities aimed at enhancing safety training. In 2019, we received very few applications for the first round of these supplements. To help ensure that the community understood the gravity of the issues, we invited Craig Merlic, executive director of the University of California Center for Laboratory Safety, to speak in the summer of 2019 at the biennial meeting of the directors of funded NIGMS training programs. His talk was riveting and received the best reviews of any presentation at the meeting. In 2020, we issued another announcement for lab safety training supplements, along with a companion announcement for supplements to promote safe and inclusive research training environments. We are encouraged by the number of supplement applications and look forward to following the outcomes of these new initiatives.

    As part of our efforts to improve training in the conduct of rigorous and reproducible science, NIGMS has funded grants to support the development, testing, and dissemination of free, online training modules on topics ranging from statistics to experimental design. These modules are available through a clearinghouse on the NIGMS website and can be used by individuals, labs, educators, or training programs to augment research training with the goal of improving rigor and reproducibility. We plan to support a similar program to enable the development of free, online modules about various aspects of safety and have created another clearinghouse on our website where the community will be able to access these modules as well as links to other useful resources to enhance safety training.

    One key feature we and others would like to see incorporated into teaching about experimental design is the use of hazard assessments and the RAMP principles: Recognize hazards; Assess the risks of hazards; Minimize the risks of hazards; Prepare for emergencies (Howson, 2016). Every time students prepare to carry out a new experimental or preparatory protocol, they should incorporate into their planning a formal hazard assessment based on the RAMP principles (Figure 2). If their programs teach them how to do this as part of developing their experimental design skills, and their mentors set it as an expectation akin to keeping good lab notebooks or doing the right controls, it will soon become ingrained in the institutional culture, resulting in much safer work environments. As part of this process, students should be taught to identify and question their assumptions. For example, in the case of the University of Chicago researcher who was infected by an attenuated strain of Y. pestis, an assumption was that the strain was nonpathogenic because it was compromised in its ability to take up iron. However, the possibility that this iron uptake deficiency could be compensated for if a human host had higher-than-normal blood iron levels was not considered. A more commonplace example is the assumption that the lab coats we use are fire resistant, an assumption that frequently turns out to be incorrect (Figure 3). As an ancillary benefit, the kind of thinking required for hazard assessments of new protocols will likely also carry over to other aspects of the protocols, resulting in fewer poorly conceived experiments.

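    To make this concrete, the brief sketch below is one illustration of how a training program might have students record a RAMP-style hazard assessment alongside each new protocol. It is our own hypothetical example, not an NIGMS requirement or an official ACS tool; the field names and the sample entries are assumptions, written here as a small Python template simply to show the structure such a record could take.

        # Hypothetical sketch of a RAMP-style hazard assessment record.
        # Field names and example entries are illustrative assumptions, not an official tool.
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class HazardAssessment:
            protocol: str              # the new experimental or preparatory protocol
            hazards: List[str]         # Recognize: chemical, biological, and physical hazards present
            risks: List[str]           # Assess: what could go wrong and how severe it could be
            controls: List[str]        # Minimize: engineering controls, PPE, procedural changes
            emergency_plan: str        # Prepare: the planned response if something does go wrong
            reviewed_by: str = ""      # mentor or EH&S sign-off before work begins

            def is_complete(self) -> bool:
                # Every RAMP element should have at least one entry before work starts.
                return all([self.hazards, self.risks, self.controls, self.emergency_plan])

        # Example: a trainee planning to transfer a pyrophoric reagent.
        assessment = HazardAssessment(
            protocol="Syringe transfer of t-butyllithium",
            hazards=["pyrophoric reagent", "flammable solvents"],
            risks=["ignition on contact with air", "spill during transfer"],
            controls=["flame-resistant lab coat", "work in fume hood", "second person present"],
            emergency_plan="Confirm locations of fire extinguisher and safety shower before starting.",
            reviewed_by="PI",
        )
        assert assessment.is_complete()

    However a program chooses to implement such a record, whether on paper, in a lab notebook, or electronically, the point is the habit: the assessment is written down, reviewed, and completed before the experiment begins.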
    FIGURE 2:
    FIGURE 2: Steps for conducting a hazard assessment using the RAMP principles, based on suggestions in “Identifying and Evaluating Hazards in Research Laboratories: Guidelines Developed by the Hazard Identification and Evaluation Task Force of the American Chemical Society’s Committee on Chemical Safety” (Hazard Identification and Evaluation Task Force of the American Chemical Society’s Committee on Chemical Safety, 2015).
    FIGURE 3:
    FIGURE 3: Many traditional lab coats are made of cotton or polyester and are neither flame nor solvent resistant. Students here are shown wearing next-generation lab coats that are flame and solvent resistant (Saner, 2017). In addition, they have elasticized cuffs that reduce the likelihood of a sleeve catching on equipment or inadvertently being dragged through the flame of a Bunsen burner. Lab coats, safety glasses, and other types of personal protective equipment should be high quality, in good repair, as comfortable as possible, and available for everyone in the lab and in the correct sizes. Photo credit: Penny Jennings, UCLA. Used with permission.

    We hope that institutions will embrace these opportunities and explore new ways to help young researchers—and faculty—develop improved skills, knowledge, and attitudes and create cultures of safety at their institutions.

    CREATING CULTURES OF LAB SAFETY AT ACADEMIC INSTITUTIONS

    Weak laboratory safety cultures have been repeatedly identified as significant contributors to accidents (U.S. Chemical Safety and Hazard Investigation Board, 2011; Hill, 2012; Safety Culture Task Force of the ACS Committee on Chemical Safety, 2012; National Research Council, 2014; Staehle et al., 2016; University of California Center for Laboratory Safety, 2016a, b). What can institutions themselves do to transform an inadequate safety culture into a culture of safety? Refocusing an institution on safety—or any other priority—can cascade from the top (Van Noorden, 2011; Gibson et al., 2014). Leadership, from presidents, provosts, and deans to department chairs and training program directors, should make clear—repeatedly and frequently—that safety is a key priority. Within a lab, PIs and senior lab personnel should follow suit. Hearing the message from those above you in the chain of command will promote thinking about safety, whereas not hearing the message will leave PIs and students with the impression that safety is not a priority. In addition to articulating a clear and consistent message about safety, institutional leaders should consider how to incentivize and reward good practices (National Research Council, 2014). Institutions and professional societies might consider giving awards for innovations in lab safety or the promotion of particularly effective strategies to reduce accidents. Universities might also explore incorporating safety records and achievements into promotion and tenure considerations.

    In thinking about how to incentivize improvements in institutional safety culture, it will be important not to penalize the reporting of accidents or near misses. In particular, near misses serve as sentinel events, and reporting them can prevent a more serious accident from occurring. For example, in the University of Hawaii explosion, had the earlier small-scale ignition been immediately reported to safety officials, it could have led to the cancellation of the scaled-up experiment (University of California Center for Laboratory Safety, 2016a, b). Bystanders should also be empowered and rewarded for reporting dangerous situations or near-miss events, because they may see risks that those doing the experiments don’t recognize. An effective culture of safety requires that people look out for one another. In this regard, universities might leverage or emulate the systems already in place in most academic medical centers for reporting safety concerns. It is also important to promote a culture that doesn’t look to blame the victim (or near victim) but instead focuses on learning lessons and implementing improvements based on accidents or near misses (Menard and Trant, 2020). Blaming the victim rather than critically evaluating the institutional policies, procedures, and infrastructure that enabled the accident will disincentivize reporting and will not lead to improvements in the safety culture.

    At most universities, the bulk of the thinking about lab safety is done by the staff of the EH&S office or its equivalent. These people are usually highly trained and are dedicated to preventing accidents so that important research can proceed as efficiently as possible. It is important for trainees to learn not to view EH&S staff as adversaries but instead to consider them allies (National Research Council, 2014). Again, trainee perception tends to cascade from the top: if the lab’s PI doesn’t have a positive relationship with EH&S staff, trainees tend to learn from that example. In designing new curricular elements to develop students’ lab safety skills, knowledge, and attitudes, it would seem wise to engage the local EH&S staff. Integrating them into the teaching teams might also be a useful approach to explore, because seeing faculty members teach alongside EH&S staff will send students a strong signal about the partnership between researchers and EH&S.

    Finally, if academic institutions worked to emulate the safety practices of industry, it would reduce accidents and help develop a more robust culture of safety. For a variety of reasons, including differences in regulatory oversight, staff experience levels, and financial risks and incentives, industry has a significantly more advanced culture of safety than exists in academia (Van Noorden, 2011; McLeod, 2015; Schroder et al., 2016). If academic training programs were to teach laboratory safety to industry standards, it would have several benefits. First, academic research labs would become safer. Although top-down reinforcement by institutional leadership and PIs would still be essential, having cadres of students entering labs with better safety skills and attitudes would increase everyone’s level of attention to safety. Second, the graduates of these training programs would require less additional training, making them more attractive new hires in industry and elsewhere. Similarly, teaching to industry expectations in areas beyond safety could also have multiple benefits. For example, teaching students to keep records to industry standards would enhance experimental reproducibility and transparency, help the university’s technology transfer office in cases where intellectual property protection is possible, and better prepare students for careers in industry and government labs.

    LOOKING TOWARD A SAFER BIOMEDICAL RESEARCH ENTERPRISE

    Many of the lessons learned from lab accidents and approaches to improve lab safety training that we discuss here (Figure 4) are also applicable to safety from harassment and intimidation. For example, the importance of leadership, proper training and oversight, incentivizing and rewarding good practices and reporting, teaching to higher standards, and the role of funding agencies are all applicable to creating a culture of safety, broadly defined. There are already a number of energetic, knowledgeable, and persuasive champions for improving the culture of safety in academia. One feature common to many of these champions is that they or someone close to them was involved in an adverse incident—whether it was a lab accident or harassment—that galvanized them to take action. We need more people to engage in these efforts before an incident happens. In all areas of safety, we need more champions.

    FIGURE 4:
    FIGURE 4: Summary of the main themes of this Perspective showing ways in which a culture of laboratory safety can be cultivated at academic institutions and incorporated into research training programs. Goals for academic institutions are shown on the left (blue and orange) and steps NIGMS is taking to help them achieve these goals are shown on the right (green).

    FOOTNOTES

    Abbreviations used: EH&S, Environmental Health and Safety; FOA, funding opportunity announcement; NIH, National Institutes of Health; NIGMS, National Institute of General Medical Sciences; PI, principal investigator; RAMP, Recognize hazards, Assess the risks of hazards, Minimize the risks of hazards, Prepare for emergencies; RCR, Responsible Conduct of Research.

    ACKNOWLEDGMENTS

    The authors thank Judith Greenberg, Peter Espenshade, and Guy Padbury for helpful comments on the manuscript and the National Advisory General Medical Sciences Council and our colleagues at NIGMS for useful discussions. We are grateful to Erika Shugart for suggesting we write this Perspective.
