Worldwide Catastrophe, Complexity, and the Great Norbert Wiener

El Penski, written July 2009, updated November 2009 and June 2011

In about 1958, I found myself frequently passing an old, fat man with a big cigar in his mouth in the halls of MIT. I recall thinking he was the perfect model for a left-wing cartoonist's drawing of a capitalist. I had no idea who he, Norbert Wiener, was until I later saw a newspaper article with his picture. The article was about his trip to the socialist USSR, which some people considered a bit subversive at the time. He was described as a great mathematician, computer scientist, and the father of cybernetics.

Norbert Wiener

Later, I attended a lecture he was giving. The lecture was well attended, but it was the strangest lecture I have ever attended. He started off talking directly to the audience. He pointed out that he was greatly concerned that technology was getting so complex that nobody could understand it and that, as a consequence, an accident could happen that would be a global catastrophe. After a few minutes of talking, he sat down in the front row with his back to the audience and continued talking in a low voice to an associate, as though he had forgotten the audience was there. I was very busy at the time, so I forgot about his warning.

Several months later, I was working as a research chemist at Materials Central, Wright-Patterson Air Force Base, Ohio. For my first experience using a complex system to do some physical chemistry calculations, I tried to use the huge, advanced analog computer facilities. After talking to several experts and spending much time, I learned quickly that an analog computer would not solve any of my problems. In spite of much kind help, I found it to be totally unsatisfactory and gave up, wondering why anyone was using the analog computer facilities at all. The widely used analog computer reached its pinnacle about that time, and since then analog computers, devices, and signals have slowly faded from use.

Shortly thereafter, an IBM 7090, a 2.9 million dollar computer, was installed at Wright Field, and a very short FORTRAN course of three hours was offered for users of the IBM 7090. I missed the first one-hour session of the course but attended the second of the three lectures. I tried to use the digital computer immediately. That same day, I was repeatedly pestering the programmer who was assigned to assist me, asking a variety of questions while realizing I had no idea what I was doing. Later, when I came into his office and announced that his IBM 7090 gave wrong answers, he laughed. He stopped laughing after he checked my printouts. Then he said, “We will dump the memory every time this problem occurs. No one has this problem but you.” A dump was then five or more inches of printer paper covered in unintelligible numbers. The next day dumps were everywhere, and the operator was wondering why there were so many. No one but me had validated the simple math calculations done by the computer. Most of the people using the computer were aerospace engineers. I thought of Norbert Wiener's short lecture and, in the months that followed, went on to make very valuable use of the great IBM 7090.
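The lesson about validating even "simple math" on a computer still applies today. As a modern illustration (a hypothetical Python sketch, not the original IBM 7090 FORTRAN problem), binary floating-point arithmetic can give subtly wrong answers for simple decimal sums, and only a deliberate validation check catches it:

```python
# Illustrative modern analogue, not the original IBM 7090 problem:
# binary floating-point cannot represent most decimal fractions exactly,
# so even "simple math" done by a computer needs validation.

def validate_sum(values, expected, tolerance=1e-9):
    """Compare a computed sum against a hand-checked expected value."""
    total = sum(values)
    return abs(total - expected) <= tolerance, total

# Naive check: 0.1 added ten times "should" equal 1.0 exactly...
total = sum([0.1] * 10)
print(total == 1.0)   # False: accumulated rounding error
print(total)          # 0.9999999999999999

# ...so a careful user validates against a tolerance instead.
ok, total = validate_sum([0.1] * 10, 1.0)
print(ok)             # True: correct to within 1e-9
```

The point is not the specific tolerance but the habit: the only user who checked the machine's arithmetic against hand-computed answers was the one who found the error.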


I did not think of Wiener again until the worst nuclear power plant disaster in history occurred in 1986 at the Chernobyl Nuclear Power Plant in Ukraine, USSR. Four hundred times more radioactive fallout was released than was produced by the atomic bombing of Hiroshima, Japan. The operators who triggered the release were trying to perform a benign test to improve the safety of the plant. Obviously, the complexity of the system overwhelmed them.


In September 2008, investment banks became insolvent, and banks stopped lending, creating a "global economic crisis." While the exact causes of this crisis will probably be debated for centuries, at the root of the problems were complex financial products, complex laws, and accounting practices that were too complex for even experts to understand. In October 2008, the Emergency Economic Stabilization Act of 2008 (the Bailout) was hurriedly signed into law.

When this disaster ends, if ever, the kind-hearted historians will probably say that financial institutions created investment and mortgage products so complex that they were incomprehensible. While some lenders, bankers, and CEOs thought they were doing good deeds, they did not understand that they were building dangerous economic bombs. For the roughly ten million people in the USA, and maybe ten times as many around the world, who lost their jobs, this disaster may have been the gravest of their lifetime.

Actually, when extremely complex legal and financial products are allowed, they provide a perfect camouflage for incompetence, greed, fraud, and negligence. As far as I can see, the experts and our government have learned nothing from this cataclysm.


The C-47 cargo plane was vital to the success of many WWII Allied campaigns: in Guadalcanal, New Guinea, Burma, and the Battle of Bastogne, flying "The Hump" from India into China, towing gliders, performing medical airlift, dropping paratroops, ferrying soldiers serving in the Pacific back to the United States, and later in the Berlin Airlift. The C-47 also earned the affectionate informal nickname “Gooney Bird” in the European theater of operations.

Several C-47 variants were used in the Vietnam War. The Canadian military used the C-47 from the 1940s to the 1980s. Thousands of surplus C-47s were converted to civil airline use and are still flying today.

When I went into the U.S. Air Force in 1959, older pilots told me tales of the reliability of the C-47 that usually paralleled the story that follows. A C-47 was taking damaging fire in combat, and the crew decided to parachute out. After the crew landed safely in a field, the pilotless C-47 made a perfect landing in the same field. Contrast this story with the following recent event.

On June 1, 2009, an Air France Airbus A330, Flight AF447, flying from Rio de Janeiro to Paris, crashed into the Atlantic Ocean off the coast of Brazil. CBS News aviation analyst and aviation safety expert Capt. Chesley "Sully" Sullenberger reported that the weather was very turbulent and stormy, it was nighttime, and the pilots could not see the earth's natural horizon. In addition, the speed sensors were yielding inconsistent readings. When the aircraft stalled, the pilots had about 30 seconds to adjust the aircraft's pitch attitude and thrust setting to maintain safe flight. They failed to react to the very complex situation in those 30 seconds, and the airplane fell like a rock, killing all 216 passengers and 12 crew on impact. (Air France Flight 447, Wikipedia, the free encyclopedia, 1 June 2011 at 22:41)

Unfortunately, examples of disasters resulting from complexity keep occurring, and I may have to add many more to this essay. I fear that these increasing problems triggered by complexity will turn this short essay into a book. The latest example of complexity increasing the seriousness of a natural disaster was the 2011 Tohoku earthquake, a mammoth magnitude 9.0 (Mw) undersea earthquake that struck 43 miles off the coast of Japan on Friday, 11 March 2011, at an underwater depth of approximately 20 miles. The quake created tremendously destructive tsunami waves of up to 128 feet high that walloped Japan, in some cases reaching up to 6 miles inland. In addition to the loss of life and destruction of infrastructure, the tsunami caused nuclear accidents at six nuclear reactors, by far the most serious being an ongoing level 7 event with an approximately 12-mile evacuation zone around the Fukushima I Nuclear Power Plant. The overall cost could exceed $300 billion, making it the most costly natural calamity on record.

It has been established that there were 15,457 deaths, 5,387 injured, and 7,676 people missing, as well as over 125,000 buildings damaged or destroyed. Vast structural damage included intense harm to roads and railways as well as a dam collapse. Around 4.4 million households in northeastern Japan were deprived of electricity, and 1.5 million were without water. Many electrical generators were taken down, and at least three nuclear reactors suffered explosions due to hydrogen gas that had built up within their outer containment buildings after cooling system failures. Residents within a 12-mile radius of the Fukushima I Nuclear Power Plant and a 6-mile radius of the Fukushima II Nuclear Power Plant were evacuated. In addition, the U.S. recommended that its citizens within 50 miles of the plant evacuate. Construction of the planned Fukushima I-7 and I-8 reactors was cancelled after the earthquake, in April 2011. The Fukushima I-1, I-2, I-3, and I-4 reactors were decommissioned and are expected never to operate again. Reactors I-1 and I-2 were still being sprayed with cooling water as of June 2011.

The Tohoku earthquake came as a surprise to seismologists, who were not expecting quakes above magnitude 8.0. The experts had assumed that the ocean containment wall would be high enough to protect the nuclear reactors. They did not expect the nuclear reactor containment systems to break, did not have a plan for a long-term failure of the electric supply, did not anticipate that the backup electric supply would fail, and had not prepared for a severe emergency.


We all depend on experts who have devoted most of their lives to studying their field and have a lot of knowledge in it. Nevertheless, my experience with specialists is that they still have big knowledge gaps in their area of expertise, often hold baseless prejudices, and are reluctant to admit that they sometimes do not know. Very distinguished experts have proven, in spite of their extensive knowledge, that they are incompetent at making predictions and assessing risks in their field, even though mankind depends on those assessments and predictions. Every time a law is passed, the unspoken prediction is that it will do more good than harm. It has been widely admitted that no one in the financial services business knows how to determine a safe level of leverage for each type of financial service.
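Why leverage is so treacherous can be shown with simple arithmetic. The sketch below is a hypothetical illustration (the ratios are examples, not figures from any actual institution): since equity equals assets minus debt, a firm levered at ratio L is insolvent after its assets fall by only 1/L of their value.

```python
# Hypothetical illustration of why leverage limits matter.
# Leverage L = assets / equity. Because equity = assets - debt,
# an asset decline of 1/L wipes the equity out entirely.

def loss_fraction_to_insolvency(leverage):
    """Fractional asset decline that reduces equity to zero."""
    return 1.0 / leverage

for leverage in (5, 10, 30):
    pct = 100 * loss_fraction_to_insolvency(leverage)
    print(f"{leverage}:1 leverage -> insolvent after a {pct:.1f}% asset decline")
```

At 30:1 leverage, a decline of only about 3.3% in asset values erases all equity, which is why "safe" leverage depends so sensitively on how well asset risk is understood.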

I have to wonder if complexity poses a greater threat to mankind than climate change, disease, terrorism, or war. When governments hastily pass 2000-page bills that are unread and not intelligently debated, that act may be a greater hazard to mankind than the problems the legislation is trying to solve.
