Ep. 100 Can major accidents be prevented?

Release Date:

The book explains Perrow’s theory that catastrophic accidents are inevitable in complex, tightly coupled systems: failures will interact in multiple, unforeseen ways that are virtually impossible to anticipate. Charles B. Perrow (1925–2019) was an emeritus professor of sociology at Yale University and a visiting professor at Stanford University. He authored several books and many articles on organizations and their impact on society. One of his most cited works is Complex Organizations: A Critical Essay, first published in 1972.

Discussion Points:
- David and Drew reminisce about the podcast and reaching 100 episodes
- Outsiders from sociology, management, and engineering entered the safety field in the 1970s and 80s
- Perrow was not a safety scientist; he deliberately positioned himself against the academic establishment
- Perrow’s strong bias against nuclear power weakens his writing
- The 1979 near-disaster at Three Mile Island: Perrow was asked to write a report, which became the book Normal Accidents
- The main tenets of Perrow’s core argument:
  - Start with a “complex, high-risk technology” (aircraft, nuclear, etc.)
  - Two or more failures interact to start the accident
  - “Interactive complexity”
- Boeing 787 failures: a failed system plus an unexpected operator response leads to disaster
- There will always be separate individual failures, but can we predict or prevent the “perfect storm” of multiple failures at once? (See the first sketch at the end of these notes)
- Better technology is not the answer
- Perrow predicted that complex, high-risk technologies would be a major part of future accidents
- Perrow believed nuclear power and nuclear weapons should be abandoned: the risks outweigh the benefits
- Three reasons people may see his theories as wrong:
  - If you believe the risk assessments of nuclear power are correct, then his theories are wrong
  - If they are contrary to public opinion and values
  - If safety requires more safe and error-free organizations
- Is there a safer way to run these systems, outside all of the above?
- The modern takeaway is a tradeoff between adding more controls and the increased complexity they bring (see the second sketch at the end of these notes)
- The hierarchy of designers vs. operators
- We don’t think nearly enough about the role of power: who decides vs. who actually takes the risks?
- There should be incentives to reduce the complexity of systems and the uncertainty it creates
- To answer this show’s question: not entirely, and we are constantly asking why

Quotes:
“Perrow definitely wouldn’t consider himself a safety scientist, because he deliberately positioned himself against the academic establishment in safety.” - Drew
“For an author whom I agree with an awful lot about, I absolutely HATE the way all of his writing is colored by…a bias against nuclear power.” - Drew
“[Perrow] has got a real skepticism of technological power.” - Drew
“Small failures abound in big systems.” - David
“So technology is both potentially a risk control, and a hazard itself, in [Perrow’s] simple language.” - David

Resources:
The Book: Normal Accidents: Living with High-Risk Technologies
The Safety of Work Podcast
The Safety of Work on LinkedIn
Feedback@safetyofwork
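To make the “small failures abound in big systems” point concrete, here is a minimal Monte Carlo sketch in Python. It is an illustration added to these notes, not something from the episode or the book, and the 1% per-period failure rate and component counts are assumed numbers chosen for clarity. It estimates how often two or more independent, individually rare failures coincide in the same operating period — Perrow’s precondition for a normal accident — as component count grows.

```python
# Illustrative sketch: probability that two or more independent component
# failures coincide in one operating period. All rates are assumptions.
import random

def p_coincident_failures(n_components: int, p_fail: float,
                          trials: int = 100_000) -> float:
    """Estimate P(>= 2 components fail in the same period) by simulation,
    treating each component as failing independently with probability p_fail."""
    hits = 0
    for _ in range(trials):
        failures = sum(random.random() < p_fail for _ in range(n_components))
        if failures >= 2:
            hits += 1
    return hits / trials

if __name__ == "__main__":
    random.seed(100)  # fixed seed for repeatable output
    p = 0.01  # assumption: each component has a 1% chance of failing per period
    for n in (10, 50, 200, 500):
        est = p_coincident_failures(n, p)
        print(f"{n:4d} components: P(>=2 coincident failures) ~ {est:.3f}")
```

With these toy numbers the estimate climbs from under 1% at 10 components to well over 50% at 200, even though every individual failure stays rare. If anything the sketch understates Perrow’s case: it treats failures as independent, and tight coupling is precisely what lets coincident failures propagate into an accident.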
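The controls-versus-complexity tradeoff can be sketched the same way. In this second toy model (again an illustration with assumed probabilities, not anything quantified in the episode), each added control makes the original hazard less likely to get through, but each control also brings a small chance of causing an incident of its own (spurious trips, new interactions). Past a point, adding controls raises total incident probability:

```python
# Illustrative tradeoff: more controls block the hazard but add failure
# modes of their own. All probabilities are assumptions, not measured data.

def incident_probability(n_controls: int,
                         p_hazard: float = 0.05,
                         p_control_misses: float = 0.3,
                         p_control_induces: float = 0.004) -> float:
    """The hazard causes an incident only if every control misses it;
    each control also independently induces incidents of its own."""
    p_hazard_gets_through = p_hazard * (p_control_misses ** n_controls)
    p_no_induced_incident = (1 - p_control_induces) ** n_controls
    # Incident if the hazard gets through OR any control misbehaves.
    return 1 - (1 - p_hazard_gets_through) * p_no_induced_incident

for k in range(9):
    print(f"{k} controls: P(incident) ~ {incident_probability(k):.4f}")
```

With these numbers the incident probability falls from 5% with no controls to a minimum around two controls, then creeps back up as the controls’ own failure modes dominate. That U-shape is the tradeoff the episode describes, and one reason incentives to reduce system complexity matter.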
