Norman Augustine: Simple Systems and Other Myths

I just found a printed handout from this Norman Augustine lecture and there’s no trace of it on the internet.  So, I’ve typed it in for you.  — Charvak

Excerpts from the Dec. 5 inaugural Brunel Lecture Series in Complex Systems

The fact is there’s no such thing as a simple system.  And complex systems are made particularly challenging because the interactions in those systems are so easily overlooked or misunderstood.

Today, I’d like to share with you some lessons from my own experience in systems engineering.

1. The first lesson that I would like to cite is the importance of taking a broad view of what constitutes a system.  Remember Andrew–a particularly devastating hurricane that hit Florida a few years ago?  When the hurricane hit, the telephone companies were having difficulty getting the telephone system back in operation for the first couple of days.  The reason was not lack of wire or trucks or switching centers.  The recovery was stopped by the lack of child care centers.  Most telephone employees come from two-wage-earner families, and when the hurricane knocked out all the child care centers, one parent had to stay home with the children.  Only about half the work force showed up on the day they so badly needed all the work force and more.  The telephone company quickly called in their retirees, who set up day care centers so the regular work force could work.  The problem, of course, was that they had too narrowly defined the system.

2. The next major lesson is to bound the problem.  This may sound contradictory to the first lesson.  It is, but no one ever said that systems engineering was easy!  To true transportation engineers, the air transport system is really only a part of the transportation infrastructure.  They would be thinking of highways and ships and perhaps even how to move information.  The challenge for systems engineers is to determine what the boundaries of the system should be for practical purposes.  If you include too little in the system, it becomes untenable; if you include too much, it becomes unanalyzable.

3. The next lesson is to watch for unintended consequences; sometimes these come from extremely subtle connections or interactions.  On my first day of work out of college, a wise old engineer told me, “No change is a small change.”  If I’d paid attention, I could have saved my employer billions of dollars on mistakes I’ve made by not realizing there’s no such thing as a small change.  Let me illustrate by referring to the Standard ARM, an anti-radar missile used during the Vietnam War.  It had been tested extensively and performed beautifully.  In Vietnam, the missiles were flown against radars in North Vietnam and, at 16 seconds time of flight, they would all blow up.  Back at the test range, they worked fine.  Investigations found nothing.  The only difference between the missiles that went to the fleet and the test ones was that one had an actual explosive in the nose, whereas the other had an inert warhead.  On the ones that went to the fleet, someone had put a sticker on the side of the missile that said, “Live round.”  Investigators discovered that the glue holding the sticker on would debond at the temperature the missile reached at about 16 seconds in flight.  In wind tunnel models, the debonded sticker would be blown by a strange airflow right through the guard beams and detonate the warhead.

4. The next one, question everything, is a lesson that Warren Buffett told me before I was to teach my first class.  He said the most important lesson that I could teach my students was to always have someone around who could tell the Emperor he has no clothes.  It’s very good advice to always have someone around, preferably yourself, but others too, who can challenge what you’re doing.  Prior experience and inborn biases can cause a person to have very fuzzy vision.  And if prior experience can cause fuzzy vision, arrogance can cause absolute blindness.

What’s the danger here?  When the Hubble Space Telescope was first launched, the media called it the Near-Sighted Mr. Magoo.  The subcontractor who built the optics for this telescope, one of the finest optical manufacturers in the world, built the flight article and then tested it.  When they tested it, they got a pattern that indicated a huge error.  They said to themselves, there’s no way we could have made an error like that; something must be wrong with the test.  So the telescope went into orbit, and it turned out they had made a huge error.  Fortunately, NASA was able to put a set of corrective eyeglasses on the telescope, and eventually we got sensational pictures.

5. Another lesson is that the things you worry about usually aren’t the things that do you in.  That’s because you pay attention to them, and you can usually solve them.  Rather, you tend to have your lunch eaten by things that were overlooked or that you thought were under control but were not.  This past year, I served on a commission to review the Osprey, a tilt-rotor aircraft.  Our question was whether the concept of having an aircraft that’s both helicopter and fixed-wing in performance might be fundamentally flawed.  Aircraft of this type had a tragic record: five crashes, 22 Marines killed the prior year alone.  Everybody was very focused on this complex rotor system.  We dug into the five crashes.  If my memory serves me correctly, there were three unrelated mechanical problems, one maintenance error, and one pilot error.  None had much to do with the very complex new concept.

6. Then, watch for details that will get you if you don’t watch out.  There are some painful examples of not getting the details right.  Remember the Mariner spacecraft that we built for Jet Propulsion Laboratory, one of the finest technical organizations in the world?  To our great embarrassment, NASA and we were working in different units–English and metric–and we lost that spacecraft.

7. The next lesson is treasure your anomalies.  While the details can hurt, they can also help.  In that Mariner mission, there were a number of earlier course corrections on the way to Mars that all had the same directional bias, which was peculiar.  Had we questioned that, we might possibly have discovered that we were working in different units, but nobody challenged it.  They just put in a correction.

8. Then, a lot of redundant systems aren’t redundant.  An L1011 with three engines was flying from Miami to Nassau.  It had an oil loss warning indicator on one engine, so the pilot turned around to fly back to Miami.  On the way back, the second engine gave a no-oil indication, then the third engine.  The pilot said there’s no way that you can have three separate engines with totally separate oil systems all lose their oil on the same flight.  Well, it turns out there is a way.  Just before they had left Miami, the maintenance people had put a new chip detector in each engine.  The chip detectors had all come from the same supplier, who had left off an O-ring in the assembly process.  Happily, the engines didn’t fail.

9. Finally, complex systems often involve human components.  When I was assistant secretary of the Army for research and development, we were producing the Pershing Missile.  The engineers realized that somebody could accidentally get two huge cables reversed, and that would be very bad.  So, they designed one cable bundle with a 16-pin connector and the other with an 18-pin connector.  The only problem was that the strongest soldier in the United States Army forced the 16-pin connector into the 18-pin connection.  A fire followed.

Today, I shared with you nine lessons from my own experience in systems engineering.  You might say, why not ten?  The reason is that I’m sure everyone–particularly the old hands in the audience–has one of their own to add.