System designers have labored to address these issues. But there is another danger, one that is increasingly common yet seldom discussed.
The fly-by-wire system is a modern engineering marvel. A computer linking the cockpit to aircraft control surfaces continuously analyzes pilot inputs. If the pilot pulls the control stick back, the computer recognizes that the pilot wants to climb and raises the aircraft's nose. Such maneuvers are always performed within limits deemed safe by the aircraft manufacturer. In fact, the system is so advanced that it prevents the pilot from executing maneuvers that jeopardize the aircraft's safety. The system does not, however, stop a pilot from attempting to do so. That freedom still exists, and it is hardly unique to fly-by-wire. More and more, system designers are opting to provide technological solutions that obscure the impact of human error rather than prevent those errors from occurring. Given record low accident rates, the general sentiment seems to be "if it ain't broke, don't fix it."
Such thinking can be problematic, according to Greg Jamieson, a professor of industrial engineering at the University of Toronto. An expert on how humans and machines interact, Mr. Jamieson questions whether designers should be satisfied that their inventions have minimized the impact of human error. Or is there, as he suggests, an additional responsibility to ensure those errors do not occur in the first place?
Meeting this responsibility is important should technology fail. And technology does fail. In 2014, a technical glitch in an air traffic control system caused the grounding or delay of hundreds of flights in the United States. A similar incident occurred a year earlier in India. In that instance, the radar screens that air traffic controllers rely on to direct airplanes went blank for more than nine minutes. Human intervention was needed in both cases to resolve the situation.
Ensuring such intervention is safe means shifting focus from designing systems that accommodate undesirable human behavior to developing systems that change it. Such an approach has seen previous success. For example, the airline industry has long struggled with what travel writer Spud Hilton calls "luggage that is more the size of a clown car than a carry-on." Oversized carry-ons force passengers to check their bags due to limited overhead cabin space. This, in turn, causes flight delays. Airlines are responding by imposing penalties to dissuade passengers from flouting carry-on rules. United Airlines, for example, sends some passengers back to the ticket counter to check oversized carry-on luggage for a $25 fee. So does Toronto-based Air Canada. These policies are designed, in the words of one airline executive, to reshape passenger behavior.
When it comes to designing the next generation of technology, engineers would be wise to follow their lead.