Lessons from Japan...
Posted: Wed Sep 06, 2017 4:57 pm
I'm deeply saddened by the events that devastated Japan: a deadly magnitude 9.0 earthquake, followed by a massive tsunami, the resulting Fukushima nuclear meltdown, and subsequent aftershocks. One can only imagine the peril of evacuating your home (at a moment's notice) with the possibility of never being able to return. The deepest shock comes in losing all of your personal possessions, including pictures of your past that aren't yet digitized and stored in an independent location.
One of the things that gets to me most is the handling of the nuclear crisis, and how propaganda has been spread about the radiation released by the plant's failure. Radiation is an unseen killer; it's in nearly every home in the form of electronics, your cell phone, your microwave oven, and it possibly provides electricity to your community. Long forgotten is the incident that occurred at Three Mile Island in Pennsylvania, right here in the US, back in 1979.
The big problem with nuclear incidents is that, for most people, the effects don't manifest until long after the incident occurred. How can we comprehend exactly how radiation hurts us if it comes on in such a subtle (and delayed) manner? I suggest that the news media carefully track the people who work on resolving these "meltdown" incidents (the people closest to the source). Are the people who worked on-site to fix the plants at Three Mile Island and Chernobyl even still alive? These details we rarely hear. We're lulled into submission simply by abandoning meltdown zones like Chernobyl, and lobbyists quickly change the discussion to how "safe" nuclear power plants can be, until we forget to pursue the truth because our lights are running again.
How can we trust a government that assures us it has incidents like this under control, while we still hear conflicting reports on the news about how the plant (as of today) has reached Chernobyl-level meltdown status?
As engineers and scientists, we all have to delve into technology carefully. We are at the mercy of what we don't yet know. We can't allow ourselves to get arrogant, because our blindness to unforeseen future events in the course of innovation can prove deadly. How can arrogance contribute to deaths, you ask? Let me remind you with a few words: the Hindenburg, the Titanic, Hiroshima, the Exxon Valdez, the harmful effects of asbestos (used in a wide variety of building materials since the 50s), lead in paint and gasoline, the Ford Pinto, (more recently) the epic Deepwater Horizon oil spill, tainted milk from China, sticking accelerators on Toyota Priuses, the construction of the World Trade Center, pretty much every airplane crash ever, and many other incidents with significant environmental and human costs.
When we picture overconfidence in a bad idea, scientists and engineers aren't usually who come to mind; we picture a guy driving a Hummer (with a tight Ed Hardy shirt on) listening to euro-house. But arrogance can come in the form of believing that (currently existing) "proven" scientific concepts are "rigid" and "infallible". The same goes for religion, but that's another discussion entirely. Case in point: the Titanic was launched as an unsinkable ship. Well, after watching the James Cameron drama fest of the same name, the ship clearly sank because it hit an iceberg. Man - 0, Science - 0, Icebergs - 1. That is one of the most significant cases of hubris in the face of circumstances that no normal person at the time could foresee.
We have to be aware of unforeseen conditions, unimaginable knowledge, and unpredictable circumstances that are always out of our control (I like to just call these things future circumstances). The key is creating firm back-up plans when lives are at stake. Technology should always be driven not only by a deep innovative pulse, but by a pulse of versatility, safety, and multi-dimensional thinking. Why is it so common for us, as humans, to create backup systems that fail? During the Deepwater Horizon oil rig catastrophe, there were several scientists who could not seem to solve the crisis, including Bill Nye (playing a scientist on TV). The incident after the initial explosion was made even more dramatic by a failed blowout preventer, another engineering backup plan that failed miserably. Let me now point you to this video from 1979, where the EXACT SAME INCIDENT OCCURRED:
What have we learned since then? Well, as this video shows, not much... The crisis was solved using the exact same method we used in 1979. Had the news media reported on this as a first impulse, we could have saved months of failed attempts and directed our efforts towards this proven method of solving the spill. In hindsight, a scientist would tell you they already knew that, probably.
We have to improve our methods of preventing crises by hoping for the best but preparing for the worst, and we have to work hard to keep our egos in check. A Darwin Award is not quite something you can prevent, but arrogance in science that impacts lives IS.
Ford Pinto, you say? During the 70s, the Ford Pinto was cited as one of the most dangerous cars of all time:
This car (named after a horse) is noted for exploding into a fiery mass even in simple low-speed rear-end collisions. It was designed poorly by scientists and engineers just like you (not me, of course).
What's driving this hubris? Well, based on the last video, I'd have to say it's profit...
Companies get so enthralled with watching stock valuations, profits, and (these days) database-driven metrics that they get "tunnel vision". Companies look at the bottom line, and not how it's reached. By keeping that narrow view, companies often lose a sense of what builds a firm basis of customer satisfaction, brand loyalty, and a solid safety record. This provides a ripe breeding ground for the arrogant, overconfident scientist or engineer to step in and conduct their "errant process" of creating failure-prone solutions and products, without many checks and balances. Too many products are rushed to the finish line with flaws, some all too obvious... In modern times, companies don't worry about their reputations beyond a large public outcry. All too often it's only the Internet that keeps companies from forgetting their failures, and that may fade with our recent moves towards a "closed web" where companies will end up owning the only channels of communication. If homogeneous companies own all of the sites where they're promoted and reviewed, you'll only see the good reviews, and not the bad ones, or the vital independent information that could prevent you from getting killed in a rear-ender fireball blaze (after unfortunately paying way too much) for a used Pinto.
As a closing thought, I'm filled with amazement whenever I encounter videos from a bygone era that predict the future. There are tons of wondrous videos from the 1920s, 30s, and 40s that predict the year 2000. Now that we've passed 2000, pretty much all of the innovations in these videos have proven to be off-base, and some are so wildly off the mark you can't help but laugh at them.
We have to realize that those of us here now (predicting the future) are in the same boat as scientists and engineers from those long-gone eras. We have to account for hubris and really begin to make inspired products that incorporate new angles of safety, corporate responsibility, and innovation. We also have to proof-read our statements and scan them for unforeseen scientific "arrogance" based on the assumption that today's facts will last into the future. We also have to invent new processes for handling incidents we'd never imagine possible (like an earthquake followed by a tsunami, and then a nuclear crisis). We'll also have to account for incidents in the past and immediately refer to them when similar crises occur (I suggest more publicly available, database-driven info resources). We wasted so much time on the Deepwater Horizon oil spill trying new ideas that the impact was made even more significant, when all we had to do was use past methods. Most of all, we need to create firm back-up plans for all of our engineering goals, and to always plan for failure, even in more than one stage.
Thinking from outside our own perspectives offers us a much more robust outlook on life. In the age of individualism, we need to remind ourselves of our responsibility to society, and that solutions to problems will never be good if they are viewed from one, or even only a few, angles; solutions need to be viewed from ALL angles. Companies should worry less about earning for shareholders and more about building valuable and reliably safe products when lives are at stake. I guarantee this new outlook will save you tons of scandal and gallons of useless apologies, none of which will fix a nuclear power plant in meltdown, nor the lives affected by it.