In our last episode of Stereo Chemistry, we talked to chemists who survived accidents at the bench and learned what went wrong and what lessons they could share to improve lab safety. In this episode, we’re looking at what it takes to build a culture of safety. That is, what can organizations do to let researchers know that their safety is not only valued but expected? Hosts Jyllian Kemsley and Matt Davenport talk to experts about the importance of leadership, commitment, and education to transform lab safety from an exercise in compliance to a core element of the central science. Listen to the episode now at cenm.ag/cultureofsafety.
Matt Davenport: Hey everyone. This is Matt Davenport, and you are listening to C&EN’s Stereo Chemistry. This episode is all about safety culture in research science. And it’s actually part 2 of a two-part lab safety series we’re doing.
If you haven’t heard part 1 yet, I recommend giving it a listen first. It’s not absolutely essential—everything in this episode should make sense without hearing our previous show. But I do think it provides some context that could make this episode a little more meaningful for you. A quick word of warning, though: that first part does describe a couple of lab accidents in graphic detail. So please take that into consideration while deciding whether to give it a listen.
With that said, we’re just going to jump right into it this week. Joining me again is C&EN’s safety guru, Jyllian Kemsley, from her office in beautiful Northern California. As we got ready to make this podcast, I was talking a lot about safety culture, and before we started recording, Jyllian pointed out that there’s a difference between safety culture and a culture of safety. So why don’t we start with that distinction. What is it?
Jyllian Kemsley: So everywhere has a safety culture. It may be a good safety culture. It may be a bad safety culture. But everywhere has some sort of safety culture already.
So if we back out of laboratory safety, say your workplace has a rule against holding business calls on a cell phone while you are driving. It’s for employee safety: we do not want you to do this. So the rule is there. Now in a place with, say, a poor safety culture, everyone would ignore that. In a place that has a good safety culture, people would pay attention to the rule. You wouldn’t have managers calling subordinates when they know the subordinates are driving, or stuff like that. So that’s safety culture. It can be bad, it can be good, it can be somewhere in the middle.
A culture of safety to me—and I think to most people—means a culture of people working with safety in mind and thinking about safety as part of what they are doing. And that would generally correlate with what we might think of as a good safety culture. And I contrast that with culture of compliance, which is more of a checking the boxes. Do we have the rule? Yes. Have we told everyone about the rule? Yes. Have we issued everyone’s safety glasses? Yes. But that doesn’t necessarily mean people are working safely. So you might think of a culture of compliance as a poor safety culture, and a culture of safety, meaning people working safely and thinking about safety as they’re planning their work, would be more of a good safety culture.
Matt: Have you thought about what are the hallmarks of a good safety culture or a culture of safety?
Jyllian: It’s a little nebulous. I do think it to some degree at least comes down to that distinction between culture of compliance and culture of safety, where you’re not just following rules and checking boxes. You’ve got people thinking about it as part of their experimental planning and valuing themselves and their colleagues.
Matt: And I like what Jyllian said about this being a little nebulous. As we started working on this episode, I saw a tweet from Alex Goldberg, who is a chemist at Gilead Sciences and, dare I say, a Stereo Chemistry superfan. But the tweet said he knows what a bad safety culture looks like. He knows what a good safety culture looks like. But how do you build a good safety culture?
The bad news is that I don’t think that we have any simple answers in this podcast. Hopefully, that’s not too much of a surprise, though. I mean, changing a culture is hard. It involves people, who, at least in my experience, can be stubborn creatures. And to complicate things further, safety is a dynamic thing. It’s not like you can take a training course that makes you safe forever, right? Different situations have different hazards that you need to assess in order to approach them safely.
In this episode of Stereo Chemistry, we’re going to talk with chemists and safety experts about what it takes to make changes in how people think about safety and what more you need to make a culture of safety. And we’re going to share a few resources that we hope you’ll find helpful in your efforts.
So let’s start at the very top. Like, what are the elements of a good safety culture? Who are the players involved? Mary Beth Koza, the executive director of environmental health and safety at the University of North Carolina, Chapel Hill, had a really good answer for that.
Mary Beth Koza: I define safety culture as an equation. Leadership plus organizational design plus empowerment of the individual. So you need a commitment from leadership. And then under organizational design is to make sure you have budget and structure. But then we need every person on the campus to have an ownership of this commitment to safety.
However we change the culture, it starts at the top. Culture always starts at the top, but it also starts from the bottom, too.
Matt: And one of the interesting questions to me thinking about leadership is how does the American Chemical Society fit into shaping the culture of safety in chemistry? So first, I’ll remind you that ACS publishes C&EN and, thus, this podcast. So ACS is my employer.
Secondly, it’s a massive scientific society with over 150,000 members. If you’ll permit me an understatement, a lot of those members care about safety. The society also has volunteers that do things like coordinate symposia about safety at conferences and decide what the society says about safety and how it says it at an organizational level. We also have a full-time employee devoted to safety, Marta Gmurczyk.
Marta was actually on an ACS Safety Culture Task Force. In 2012, that task force published a report that outlined seven key considerations integral to a strong safety culture: leadership, education, attitude, sharing lessons learned, collaboration, communication, and funding.
Marta Gmurczyk: But in my personal opinion, there are two that are the most important and these are leadership and education.
Matt: That is Marta.
Marta Gmurczyk: The direction and strength for safety cultures in any organization is set by its leaders, because these are the leaders that inspire others to value safety.
But laboratory safety also involves the development of very specialized knowledge and skills, and this must be an integral part of the chemistry curriculum. So traditionally, we have been teaching students what to do or what not to do in order to be safe in the lab. So, for example, wear safety goggles, don’t wear open-toed shoes, tie back long hair, and so on. And these are rules, important rules. But I don’t think rules drive the change.
And this is what safety used to be for me, but safety is so much more. It is understanding hazards, being able to find information about hazards. Also, understanding the risk from the hazards and being able to critically think about ways to minimize this risk.
Matt: The goal of that education is then, in essence, moving away from a culture of compliance. It’s moving away from safety as a set of rules and treating it as a way of thinking about what you do.
Marta Gmurczyk: When you think about science, right, we always think about scientific method, how you build a new scientific discovery. But if you think about the lab processes, to me, after working all this time with chemical safety, it is so obvious that before you do something in the lab, you should have hazard assessment. We know chemicals do have these intrinsic hazards. We know there is a certain danger related to working with chemicals. So it should be so obvious that before you design any process, you should think about these hazards, you should know how to find information about the hazards. Then you should know how to assess the risks, right?
Matt: So I want to kick it back to Jyllian here. Do you remember your safety training in grad school? Did you have one?
Jyllian: We did have safety training. I don’t remember most of it. The only thing I remember, and so the only reason I can say I know we had safety training, is that they took us out to the parking lot and let us set off fire extinguishers. So that’s the only thing I remember.
A decade or so later I was at a friend’s house and her toaster oven caught fire and she had a fire extinguisher and I think there were five of us in the kitchen. And everyone was kind of standing around looking at each other like, “OK. There’s the fire extinguisher. What do we do?” And I was the one who took it and pulled the pin and actually put it out, and I think part of that was because I had done it before.
Matt: So that is also the only thing I remember from my safety training in grad school: going out behind our building and having the opportunity to actually use a fire extinguisher. I think there is something to be said for that. Probably everyone in that room with you knew academically how to use the fire extinguisher, but having done it before, you know it’s a thing you can do to put out a fire.
Jyllian: It’s muscle memory to some degree. You’re less freaked out by it, having done it before. I think that goes to an important part, or maybe what should be an important part, of laboratory operations. It’s similar to how we all have to do fire drills, for example. Part of the reason for doing that is so that people have walked through it and know exactly what to do if there is a real fire: what they need to shut down, how they get out of the building. They have that muscle memory, so it becomes routine. For people working in a laboratory where there’s enough of a hazard that you have to respond to it, it’s worth actually walking through what your response would be, so that people do have the muscle memory.
Matt: Many of these different elements of education and training that we’ve been talking about have been formalized in an undergraduate safety education textbook, which we’re going to talk about in a second with Ralph Stuart. He’s the environmental safety manager at Keene State College, a primarily undergraduate school in New Hampshire. He’s also the chair of the ACS Committee on Chemical Safety.
In our last episode, Jyllian and I talked about the Sheri Sangji case and how Jyllian hoped it was a watershed moment in laboratory safety. And I think everyone that I asked about it agreed that it was—that since Sheri Sangji’s death, the way we talk about and practice safety in the lab has improved overall. And Ralph says there are a couple of other important factors to consider as to why that is.
Ralph Stuart: As I’ve been exploring the history of chemical safety, laboratory safety, I’ve seen a lot of articles from the 1920s and 1930s where faculty were saying that their students were not learning enough about chemical safety and employers were saying the same thing. There were deaths in the laboratories. Marie Curie is the most famous one. She not only died from her science, her daughter died from her science. Many of her lab techs died from their science. So we have a long history of struggling with laboratory health and safety within the chemistry discipline. So the question is why did the Sheri Sangji death accelerate concerns and study in this? And I think the primary reason that it did is that there were two technical events that happened about the same time.
Matt: One was the Globally Harmonized System of Classification and Labelling of Chemicals, or GHS. This system standardized the way people talk about the hazards associated with a chemical, breaking them down into health hazards, physical hazards, and environmental hazards.
The GHS is also responsible for the standard hazard pictograms that you probably see all the time on chemical labels, data sheets, and storage areas. You know what I’m talking about. The red diamonds that will have like a little black flame inside to tell you that something is flammable. Or the corrosion one is the one that I always remember. It’s the drops coming out of a test tube, eating away at a little slab on one side and a hand on the other.
All of that comes from the GHS, which was developed by the United Nations and adopted by the US Occupational Safety and Health Administration in 2012.
Ralph Stuart: That has given us almost what I would call a periodic table of chemical hazards. When the globally harmonized system came out, it gave us a way of organizing humongous amounts of data into a searchable and actionable collection of information, and that’s essentially what the periodic table does as well.
Matt: The second development Ralph credits with the sustained improvements to safety culture comes from that undergraduate textbook we mentioned.
Ralph Stuart: So in 2010 there was a textbook written by Dave Finster and Bob Hill that outlined the RAMP approach to laboratory safety.
Matt: RAMP is an acronym and mnemonic device. You’ve got R for recognize the hazards, A for assess the risks, M for minimize the risks, and P, prepare for emergencies.
Ralph Stuart: The really important thing about it is that it separates the recognize step from the assessment step. So in the typical intuitive approach that we used before that, if you saw you had benzene, then you assumed, “Oh my god, you have a carcinogen and you have a flammable chemical.” But just the fact that you have 1 mL of benzene in the fume hood does not present either a fire hazard or a toxicity hazard. So you have to not only identify the name of the chemical to understand the risk; you have to assess the situation. Is 1 mL of benzene in a fume hood a hazard? It’s very unlikely that you’d come up with a scenario where it would be. Now if you’re working with a gallon of benzene in a fume hood, or half a gallon of benzene on the benchtop, yeah, you’d have a very different scenario. So by giving us a way to separate recognizing a hazard from assessing the risk, RAMP avoids conflating different steps in the process, and it gets us away from the danger-culture approach and toward the idea that you can actually work with a chemical in a safe way if you think it through.
Matt: Again, the idea is to teach students that safety is an exercise in critical thinking. And that this exercise is a core part of doing chemistry. But you need an entire organization, not just students, to buy into that philosophy.
Ralph Stuart: How you really develop a culture of safety is by connecting the safety work to the mission. A lot of times, safety is presented as something to distract you from the scientific mission or the technical mission. But I think when people recognize that it really enhances your science, enhances your work, and helps you get your work done more quickly and more effectively, then the safety work that we’re asking them to do really adds value to their job. And that’s when it really becomes part of the culture.
Matt: We will have a collection of links that we share with this podcast to point you to all the safety resources that we talk about. And we are going to talk about even more of them later on. But before that, I also had a chance to talk to Deborah Davis about the importance of training. She’s currently in marketing development at ExxonMobil, but she came up through the company as a polymer chemist, and she still is a safety advocate. Based on our conversation, it kind of sounds like they are all over building that mind-set we heard about from Ralph. Debi was very clear with me that, at ExxonMobil, it’s not just about results.
Deborah Davis: How you get the results is just as important. We’re not going to do it at the risk of safety. And there are a lot of policies and procedures and structure around that that help them get through that. One of the statements that we have is “Everybody’s a safety leader.” Everybody. Anybody is empowered to stop any experiment at any time. That’s absolutely the truth. So that’s where they will struggle sometimes, when somebody will interrupt their work and say, “Did you do this?” or “Did you do that? Did you think about that?” or “That doesn’t look safe to me.”
So when somebody gives you feedback that says, “I don’t think what you’re doing is safe,” if they receive that feedback very negatively and get very defensive, it’s going to be difficult for them until they learn, “OK, I do have to get better at this. I do have to receive that feedback, because they took the time to care enough to stop what I’m doing.” So that should be a “thank you,” not getting defensive and upset and angry. They’re actually trying to help you. So we do see some struggles in that area. And I’m sure every other industrial company sees the same thing.
We absolutely need results, and we will get results, but we will get it safely. Because one incident can shut that project down if you blow up a lab or a hood or somebody in that area gets hurt or there’s a release. Everything shuts down until they figure out why it happened. How can we prevent it? I promise you you’re going to be much faster at preventing those things than you are at cleaning it up.
Matt: When we’re thinking about safety culture, Jyllian, what to you is the biggest difference between industry and academia?
Jyllian: I think the biggest difference is that academia has, on average, a much younger workforce and much higher turnover than industry does. And 20-year-olds think they’re invincible. I certainly did. I’m sure you did. I’m sure everyone we knew in our 20s also felt they were invincible, and I have a much better appreciation and awareness for my own health and safety now than I did in my 20s. So I think there’s a different perspective in the industry workforce because you have a proportionately older and more experienced workforce.
In terms of saying to people, “If we can’t do this safely it’s not worth doing,” I don’t see any reason that academic researchers, anyone in academia—whether it’s your provost or your dean or your department chair or your PI—I don’t see any reason that they could not say that and make it clear that that is how people would operate. Now whether they choose to do it is a different matter, but I think in principle, we would all hopefully agree that there is no research result that is more important than someone’s fingers or arm or eyesight or their life.
Matt: With that in mind, we wanted to talk to somebody about changing safety culture in an academic system where they did have that tragedy, to hear how it was done. Which brings us to Debbie Decker.
Debbie is the safety program manager for the Chemistry Department at UC Davis. And she was there when the entire UC system was legally compelled to bring its labs up to California labor codes. This was part of the 2012 settlement the UC regents made with the Los Angeles district attorney’s office in the criminal case following Sheri Sangji’s death. I asked Debbie to describe what that was like.
Debbie Decker: Oh, it was just abject fear. I mean really. Terror will cause people to become laser focused on things. And I had a wonderful supervisor, the chief administrative officer here in the department. And a lot of support from the senior faculty whom I had worked with over the years in my role at EH&S. And with the department chairs (the department chair changed in the middle of this), both were very supportive and just, “Whatever you need, Debbie. Whatever you need. Tell us and we’ll make sure that the faculty come along.”
And it was lots of communication. Lots of communication. And I think being a woman helped, being someone who isn’t authoritarian. If I have to be, I can be, but that’s not a comfortable role for me. I just came at it collaboratively because that’s just so fundamental to who I am. And because I had worked with senior faculty in other contexts, that helped tremendously in communicating, “OK gang, this is what we’ve got to do. You can go squawk at the regents if you want, but here it is, and we’ll try to make this as painless as possible, but this is what we have to do.”
Matt: For those of you who listened to our last episode, you might remember that I teased this episode with Debbie saying that the ACS could be doing more on the safety front. I was being a little sneaky. Don’t get me wrong. Debbie is a fellow of the ACS who has been volunteering her time and effort for safety issues for a long time. She does think the society could do more, especially in terms of having safety information discussed and shared by more divisions within the ACS.
But here’s where I was a little sneaky. She’s also really proud of what the society has done to push safety culture in a positive direction. And one of the biggest changes sounds like a straightforward one to make.
Here’s the quote from Debbie that I used to tease this episode, but I’ll play you the whole thing this time.
Matt (in interview): What can or should this massive scientific society do to help further these efforts?
Debbie Decker: More. More. One of the highlights of my career as a volunteer in the ACS was that, I’ll get emotional about this, Matt, because it was huge. I can’t tell you how huge it was that after 100, 150 years, safety and ethics are the core values of the society. That’s huge. You can say, “Well, of course, safety and ethics.” No, no, no. You’ve got to say it. You’ve got to say it. And that’s how I feel we honor the memory of those people who have, who have died on the altar of discovery research. We honor Sheri Sangji when we make safety a core value of the society. We honor their memory.
Matt (in studio): We’re going to take a quick break. When we come back, we’ll be talking more about those safety resources that we promised earlier. Stick around.
If you are a discerning consumer of chemistry-themed multimedia—and I know that you are, you little Stereo Chemistry listener, you—you should really check out ACS Webinars. They have tons of free webinars on important and interesting topics. For instance, folks listening to this particular episode of Stereo Chemistry may want to register for the webinar on Aug. 15 at 2 p.m. called “Lab Safety for Researchers: Responsibilities, Regulations, and Lessons Learned.” And they’ve got lots of other great programming, including a Spanish-language broadcast coming up on Aug. 22 about biomorphs, or minerals with a decidedly biological look. So imagine rocks that look like corals or conch shells. Pretty gnarly stuff.
We love working with ACS Webinars. We look forward to teaming up with them each September for our #ChemNobel predictions webinar, and basically any other chance we get. We think you’ll love their programming, too, so if you haven’t already, be sure to check them out. I’d give you the web address, but it’s honestly just easier to type ACS Webinars into Google or your preferred search engine. We’ll also put a link for you in this episode’s transcript on our website at cen.acs.org. Thanks.
We spent the first part of this episode talking about leadership, education, and training. All of which are needed to move from a culture of compliance to a culture of safety. But we haven’t touched on another really important element, which is empowerment.
What do you do when you have safety information that you want to share? Where do you go when you have a question that needs to be answered? If you don’t know, it’s super easy to do nothing or even convince yourself that maybe it’s not worth speaking up, that your voice doesn’t need to be heard, or at least not in that moment. But I would contend that that attitude does not jibe with a culture of safety.
So in the second part of this podcast, we’re going to talk about some of the resources where you can go for answers, share your ideas, or even just talk with other like-minded people.
And I want to start this discussion with Jyllian. In our last episode, you shared a story about seeing a liquid-nitrogen dewar being filled with its safety valve closed and not saying anything about that because, in part, you didn’t know who to say something to. Looking back now, can you think of people you could have talked to?
Jyllian: I think there was someone who played a role that was kind of safety officer and kind of chemistry department facilities manager. That person was probably the appropriate one to go to.
But the other part of that is not just knowing whom to go to but also feeling like that would be welcome information to the department. So part of making clear to people whom to go to, I think, is also making clear that it’s OK to go to them.
Matt: I’ve been fortunate in that I did research as an undergrad and a grad student, and in university settings and then also at a national lab. I think through that, though, I’ve worked in both cultures of safety and cultures of compliance. And thinking back to a culture of compliance, one thing I remember is feeling unsure about talking to safety officers when I’d see something unsafe or not knowing how to do something safely. I’d start wondering, Am I going to get in trouble? Or is my adviser going to get in trouble? And so I think it’s a big deal knowing that you can trust the person you’re talking to.
So if you are a safety professional, how do you build up that trust? That’s what I asked Kim Jeskie of Oak Ridge National Lab about her experiences over the years.
Kim Jeskie: What I might have expected on a daily basis would be someone to come and say, “Hey, I’m planning this new research project or this new experiment. And I’m trying to find information on this component or a system setup—that kind of stuff, anything in general—and I’m not finding anything. Can you help me?” You basically build trust with the individuals so that they know that they can come to you for solutions and not as an individual who may be telling them, “No, you can’t do that.” But, “Hey, here’s how you can do it and let’s . . . let’s work together to find a way that you can do it more safely.”
Matt: Kim is the director of what’s called the Integrated Operations Support Division at Oak Ridge. And that may sound like a vague, new-speaky name for a division, until you learn that the division does just about whatever it can to support the lab, including instrumentation installation, repair, and calibration, along with overseeing hazard identification and evaluation. Before getting into this role, Kim had over 20 years’ experience as a researcher in chemistry and the physical sciences. I was curious why she made the change, and her answer surprised me a little.
Kim Jeskie: The biggest motivator for me is that our research community pursues research for the joy of it, for the progression of chemistry as a chemical enterprise. And it’s not possible, nor desirable, to make everyone in our research community, or even those people who are in industry, experts in chemical safety.
Matt: This is what surprised me, because it’s really easy for me to say, “Nu uh. We should all be safety experts. That would be excellent.” But Kim explained that she sees safety as its own subdiscipline of science. In that light, the “everyone becomes an expert in safety” proposal doesn’t look so practical.
For example, can you think of a place where every materials researcher has their own clean room? Let me know if you can because I absolutely want to see it. But that doesn’t really make sense compared to having a shared facility with experts trained in specific tools that can help researchers do what they want to do. And, of course, you need to be able to understand and communicate with those experts, but you don’t need to be those experts. It’s much the same with safety.
Kim Jeskie: So what hit me early on, in the early ’90s, was that there was actually an area of support where folks like me could learn more about the chemical safety side of things and enable researchers to do their research more safely.
Matt: Now Kim and her team are helping researchers plan the safety side of their operations, and those researchers know and trust that she’s there as a resource.
Kim Jeskie: People just simply don’t know what they don’t know. And empowering them with basic principles that they can expand on—for example, not trying to understand the individual hazards of every material that they work with but understanding the process that you go through to find that information and use that information—is an incredibly powerful tool for everyone that’s out there. That puts them in a much better situation of doing what they love and doing it safely.
Matt: And there may be other options for people to talk to at your institution if you, say, don’t feel comfortable going directly to a safety officer. In reporting this story, I learned about the existence of what are called laboratory safety teams, which can be made up of a diverse group of researchers, including students. I’m going to let Kali Miller explain what those are. She was on one of those teams at the University of Illinois, Urbana-Champaign, as a PhD student.
She had actually just graduated when we spoke, and now she’s working at ACS as a development editor for journals. Her portfolio includes the forthcoming ACS Journal of Chemical Health and Safety.
Kali Miller: So laboratory safety teams, sometimes called joint safety teams, have emerged all across the US as a supplementary approach to improve safety culture in academic institutions. They’re kind of like collaborative groups made up of students, laboratory safety officers, faculty, staff, those kinds of people. They’ll meet on a regular basis to discuss any kind of safety issues. The University of Illinois joint safety team was established in 2016 by our department head. The main focuses of our group have been near-miss reporting, monthly safety officer meetings, supplemental TA training, and things like that.
Matt: Kali told me the first such team was founded in 2012 at the University of Minnesota, with help from Dow. The Illinois safety team actually used that first one as a model for starting their group. And the way I’ve described it probably makes it sound like it was smooth sailing, but it wasn’t, especially when it came to near-miss reporting.
Kali Miller: We had a hard time launching it because, you know, the PIs want their students to be safe, but they also don’t want to get themselves in trouble. It’s like, “Oh, the students are tattling on the PIs,” and then the other professors are going to know which labs are unsafe just because of the number of near misses that are submitted. So ours is confidential, and we try to keep it so that you don’t really know which labs are involved. And we try to be selective about what we present.
That kind of discussion about it, I think, really goes a long way, even if not everything is reported. It just starts creating this culture of people that like to talk about safety and think about safety more.
Matt: So safety professionals and safety teams are a couple in-person resources you might have available to you. But there are also a lot available on the internet. We’re going to talk about a select few, but this list is by no means exhaustive. So please let us know about more on Twitter, or email us at email@example.com.
And before we get into web resources, Jyllian has some sage advice for you.
Jyllian: There's all sorts of possible tools. Not everything has all the information. I think possibly the closest you might come to that is PubChem, which has safety information. In particular with PubChem, one of the things I really appreciate is that it tracks back where it got that information from, so you can see the source and evaluate how applicable that source's information is to your situation.
I think the really important thing in all of this is to evaluate the source of your information. Would I rely on Wikipedia for safety information, for example? Wikipedia is great on a lot of science stuff, but I would not trust my life to the safety information in there. I think this is an area where that literacy component becomes really important. What is this information? Where is it coming from? How much can I rely on it?
Matt: The first website we're going to talk about is called the Safety Net, which collects what are called SOPs and other safety resources. I'll let Alex Miller give you the rundown. He's a synthetic chemist at the University of North Carolina, Chapel Hill, and he launched the Safety Net with his friend and colleague Ian Tonks of the University of Minnesota.
Alex Miller: The Safety Net is a web resource that collects safe operating procedures, or SOPs, and other signage and resources for primarily academic synthetic chemists—organic and organometallic and inorganic chemists. And it got started a few years ago when Ian and I were both starting our independent careers and talking about how to set a good tone of a strong safety culture in our labs. And we talked a lot about how we do the same things slightly differently in different labs and that we don't always talk about it. We wanted to build a platform where we could share what it is we do and how we do it our own particular ways and let the community see that and think about how they do it.
Matt: We spoke with Ian in our last episode about an experience he had as a postdoc where a compound literally blew up in his face. After the accident, Ian says he'd talk to people who would say things like, "We were working with the same compound and it detonated on us, too." It frustrated him that he hadn't heard that before the accident.
With the Safety Net, the hope is that chemists can more easily share how they do certain things safely and get that information out there. And that the collected knowledge can actually help accelerate research as well as improve safety.
Ian Tonks: When you are starting up a lab, you’re doing the same thing that a lot of people are doing across the country. And I think that having a collective resource in place is a great opportunity for us to combine our forces and actually spend less time repeating the same thing over and over again while also potentially learning different ways to do things.
Matt: The first SOP was submitted to the Safety Net in 2016, and the most recent one was posted earlier this year.
Ian Tonks: So we’re trying to see how this is going to evolve over time. I think that what we’re really looking forward to is getting more contributions from the community. We’ve had a lot of conversations with people off-line that have been really helpful for us changing the way we do our chemistry and learning new things. And so the more people that are willing to contribute, I think the better that discussion is going to get.
Alex Miller: Yeah. Totally. And to me the takeaway from all this is to talk about it, and that includes with your lab mates, with your EHS people, with colleagues from industry, and from other universities. And from all that, a lot of good habits can form, and a good safety culture can form.
Matt: The next safety resource we want to talk about is called Not Voodoo, a website I actually learned about from Alex and Ian. Not Voodoo is also fueled, in part, by contributions from the community. But I was surprised to learn that the site launched in 2004, before such things were common on the internet.
Alison Frontier: At that time, there weren't online resources, but it was so obvious to see how one would be useful. Google didn't really work either at that time. So I couldn't even search and find stuff. And so it was easy to write a proposal to say, look, you need one place where you can go and find all of these things that you normally walk around to your neighbors in a big lab and ask them. "What do you know about this and what do you know about this?" And if you're in a big lab, you can usually find the information you want, but it's not in a textbook. It's not written down anywhere. It's sort of just things people know, and so it would really be nice to collect all that somewhere.
Matt: That is Alison Frontier, a synthetic organic chemist at the University of Rochester. She also started Not Voodoo, but she tries to keep a low profile with regard to her connection to the site. She’d rather be known for her original research. The site, she says, is more of a collection of wisdom that’s been available, just never in the same place at once.
Alison Frontier: It’s really built for students who are just starting full-time lab work independently. Often you’re just thrown in, you’re given a bench and a hood and a mentor and a project. But you have to figure out a lot of things on your own. So what I tell my students when we do safety training is you get three opinions. Like one can be on the internet. One should be an actual person near you, and then a third one that is somebody else who’s some kind of authority.
Matt: There is a lot of information on the site, but my favorite items are collected under the tab labeled “Rookie Mistakes.” It is an amazing collection of lab misadventures, from the truly uncanny to the more mundane things like stabbing yourself with a syringe or burning yourself on a hot plate.
Alison Frontier: So that has been something that I just never anticipated, the impact it had on people. I wouldn't have guessed it, and maybe nobody would have guessed, actually.
So what would happen is I would go to a university and I would give a talk. And then I would meet the students who actually used it and knew about it, and if they did manage to connect me with the site, they would say those rookie mistakes saved them from having a bad day. I think that people can go and look and say, "Oh, it's not just me. People make mistakes."
Matt: The last resource we’re going to talk about is the Dow Lab Safety Academy, which Dow launched in 2013. Since then, it’s had over 75,000 people enroll from hundreds of universities, companies, and government agencies, according to numbers Dow shared with us. It’s been accessed in 150 countries. Mark Jones, a strategist and communications fellow for Dow Chemical, tells us that the academy grew out of Dow’s collaborations with universities. As Dow was sharing safety information with those collaborators, the company realized they could share fundamental information more widely using the internet.
Mark Jones: Much of this is not that kind of esoteric rocket sciencey stuff. It’s like, How do you use a fume hood correctly? Or how do you use gases correctly? Are you putting your chains around your cylinders? How do you handle vacuum equipment? What’s the best personal protective equipment for different applications? These are broader categories, simpler concepts to get across.
Matt: So there’s an obvious question here, and it also applies to the Safety Net and Not Voodoo. But it just feels like it looms largest for a company like Dow. And that question is, If you’re putting safety information out there, aren’t you afraid that somebody’s going to sue you if something goes wrong?
Mark Jones: The thing that is always a challenge in the big corporate environment is we do have lawyers, and we are in a litigious society. And it is always dangerous to go out and tell anyone you can do something safely. I have to give a lot of credit to our executives for actually shutting down that noise and saying this is really the right thing to do. The universities are still having fires and crazy events that we don't have in industry. We must share with them. We must make the chemical enterprise better.
Matt: At the top of the episode, I said something about not having simple answers about how to change safety culture. And maybe that wasn’t precise enough. Through all of these interviews, the one thing that sticks with me most is that you need to talk about safety. And that seems pretty simple. Maybe not easy, but simple.
You need the individual chemists—the students, the postdocs, the principal investigators—talking about safety and sharing what they know. You need to have safety professionals that you trust to be part of those discussions. You also need universities, chemical corporations, and massive scientific societies being vocal and setting the tone when it comes to safety. Change starts at the top, and change also starts at the bottom.
With that, we’ve come full circle. The decent thing to do would be to end this episode and let you get on with your lives. So that’s what we’ll do, but we are by no means done with this discussion because we know you aren’t done with it, either. This was the tip of the iceberg, and we want to hear all of your feedback. Email us at firstname.lastname@example.org or tweet me @MrMattDavenport. Jyllian’s @jkemsley.
I also need to thank Bob Hill, who was one of the authors of the book Ralph Stuart mentioned earlier in this episode. He got these safety podcasts off the ground. And Carmen Nitsche of the Pistoia Alliance connected me with so many amazing people and stories. Lastly, I wanted to give a big shout-out to our C&EN colleague Bethany Halford, who helped us connect with K. Barry Sharpless before our deadline for our previous episode. Jyllian, is there anyone you'd like to thank?
Jyllian: I cringe when I think about some of the stuff that I did as an undergraduate and graduate student researcher. So many people—and a lot of them are people we’ve talked to in these episodes, but also a whole lot of other people—have very generously educated me over time. So even though I’ve become the safety person for C&EN, the only reason I have that knowledge is because of dozens of other people who have spent their time talking to me and helping me to learn. So I’m not even going to attempt to name all of them because there’s no way I could do that, but I would just like to give a collective thank you to anyone who has ever spoken to me about safety because all of those conversations have been valuable.
Matt: This week’s episode was written by me and Jyllian. I also produced the episode.
Jyllian: Our editors are Lauren Wolf and Amanda Yarnell. Sabrina Ashwell is our copyeditor.
Matt: In the first part of this podcast, you heard “Played by Ear” by Unheard Music Concepts, as well as “Compassion (Keys Version)” and “Let That Sink In” by Lee Rosevere.
Jyllian: During the second half of the episode, you heard “Thought Bubbles” by Lee Rosevere and “Glimmer” by Nihilore.
Matt: Don’t forget, if you’re not yet subscribed to Stereo Chemistry, you can rectify that on Apple Podcasts, Google Podcasts, and Spotify.
Jyllian: Thanks for listening.