An advanced avionics aircraft may offer increased safety through enhanced situational awareness. Although aircraft flight manuals (AFM) explicitly prohibit using the moving map, topography, terrain awareness, traffic, and weather datalink displays as the primary data source, these tools nonetheless give the pilot unprecedented information for enhanced situational awareness. Without a well-planned information management strategy, these tools also make it easy for an unwary pilot to slide into the complacent role of passenger in command.
Consider the pilot whose navigational information management strategy consists solely of following the magenta line on the moving map. He or she can easily fly into geographic or regulatory disaster if the straight-line GPS course goes through high terrain or prohibited airspace, or if the moving map display fails.
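To make the hazard concrete, the Python sketch below shows the kind of cross-check that sound flight planning (or a terrain awareness function) performs: sampling points along a direct course and flagging any where terrain plus a clearance buffer rises above the planned cruise altitude. The coordinates, terrain model, and tolerances are entirely hypothetical; this illustrates the concept only, not any actual avionics implementation.

```python
# Hypothetical sketch: why "direct-to" needs a cross-check. Sample points
# along a straight course and flag any where terrain penetrates the
# planned altitude. The terrain lookup is a stand-in for real data.

def sample_course(start, end, steps=20):
    """Linearly interpolate lat/lon points between start and end.
    (A real tool would use great-circle geometry.)"""
    (lat1, lon1), (lat2, lon2) = start, end
    return [(lat1 + (lat2 - lat1) * i / steps,
             lon1 + (lon2 - lon1) * i / steps) for i in range(steps + 1)]

def terrain_elevation_ft(lat, lon):
    """Placeholder terrain model: a 14,000 ft ridge across the route."""
    return 14000 if 39.0 < lat < 39.5 else 5000

def check_direct_route(start, end, cruise_alt_ft, clearance_ft=2000):
    """Return all sampled points where terrain plus clearance exceeds
    the planned cruise altitude."""
    return [pt for pt in sample_course(start, end)
            if terrain_elevation_ft(*pt) + clearance_ft > cruise_alt_ft]

# A direct course at 12,500 ft through a 14,000 ft ridge is flagged:
print(check_direct_route((38.8, -104.7), (39.9, -105.1), cruise_alt_ft=12500))
```

The moving map alone would happily draw the magenta line straight through the ridge; only a deliberate check against terrain data, on paper or in an avionics function, reveals the conflict.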
Risk also increases when the pilot fails to monitor the systems. By neglecting to monitor the systems and check the results of each process, the pilot becomes detached from aircraft operation. This type of complacency led to tragedy in a 1995 aircraft accident in Colombia. A multi-engine aircraft crewed by two pilots struck the face of the Andes Mountains. Examination of their FMS revealed that a waypoint had been entered incorrectly by one degree, resulting in a flightpath that took them to a point 60 nautical miles (NM) off the intended course. The pilots were equipped with the proper charts, their route was plotted on those charts, and they had a paper navigation log indicating the direction of each leg. They had all the tools to manage and monitor their flight but instead allowed the automation to fly and manage itself. The system did exactly what it was programmed to do; it flew the programmed course into a mountain, resulting in multiple deaths. The pilots simply failed to manage the system and created their own hazard. What is notable is the risk the pilots created through their own inattention: by failing to evaluate each turn made at the direction of the automation, they maximized risk instead of minimizing it. An avoidable accident became a tragedy through simple pilot error and complacency.
Not only did the crew fail to fully monitor the aircraft’s automated routing, they also failed to retract the spoilers when adding full thrust, which prevented the aircraft from outclimbing the slope of the mountain. Simulations of the accident indicate that if the aircraft had been equipped with automatic spoiler retraction (spoilers retract automatically upon application of maximum thrust), or if the crew had remembered to retract the spoilers, the aircraft probably would have cleared the mountain.
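The 60 NM displacement cited above follows directly from the definition of the nautical mile, assuming the one-degree error was in latitude (or in longitude near the equator, where the two are roughly equal):

```latex
1^\circ = 60' \quad\text{and}\quad 1'\text{ of latitude} \approx 1\ \text{NM}
\quad\Longrightarrow\quad 1^\circ \approx 60\ \text{NM}
```

At higher latitudes, a one-degree longitude error produces a smaller displacement, shrinking with the cosine of the latitude; either way, a single mistyped digit moves the programmed course by tens of miles.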
En route to La Paz, the pilots unwittingly deselected the very low frequency (VLF) input, thereby rendering the automation system unreliable. Although the system alerted the pilots to the ambiguity of the navigation solution, they perceived the alert to be a computer error and followed the course it provided anyway. They reached what they thought should be La Paz but was later estimated to have been approximately 30 NM away. They attempted to execute the published approach but were unable to tune the VOR radio, so they instead used the VLF portion of the KNS 660 to guide them on an impromptu approach. Despite reported weather of clear skies and unrestricted visibility, they were unable to gain visual contact with the runway environment due to in-cloud conditions, so they proceeded to their alternate, about 1½ hours away. After 2½ hours of flight following what they thought was the proper course, the aircraft became fuel critical, necessitating a controlled letdown from FL 250 toward what they presumed would be visual conditions. Ironically, at about 9,000 feet mean sea level (MSL), they broke out of the cloud cover above an airfield. Although they attempted to align themselves with the runway, the aircraft ran out of fuel. The pilots dead-sticked the King Air onto a ramp, after which it broke through a fence, went over a berm, and ended up in a pond. The aircraft was destroyed. After exiting the aircraft relatively unscathed, the pilots discovered they had landed in Corumba, Brazil. [Figure 7-3]
Figure 7-3. The pilots of a King Air 200 planned a flight from Bogota, Colombia, to Iquitos, Peru (for fuel), and then to La Paz, Bolivia, as the final destination. They listed Viru Viru (located at Santa Cruz, Bolivia) as their alternate. The aircraft was equipped with a Bendix King KNS 660, which provided integrated navigation solutions based upon VOR, DME, and two variants of VLF radios. At that time, GPS had not yet been integrated into the FMS.
In this accident, the pilots failed to realize that when no Omega signals were available, the VLF/Omega equipment could continue to provide a navigation solution, without integrity, using the VLF system alone. Although the VLF/Omega system is now obsolete, having been replaced by the Global Navigation Satellite System (GNSS) and Loran-C, this accident illustrates the need for pilots of all experience levels to be thoroughly familiar with the operation of the avionics equipment being used. A pilot must not only know and understand what is being displayed, but must also be aware of what is not being displayed.
A good information management strategy should include practices that help ensure situational awareness is enhanced, not diminished, by the use of automation. Two basic procedures are to always double-check the system and to conduct verbal callouts. At a minimum, ensure the presentation makes sense: was the correct destination entered into the navigation system? Callouts, even for single-pilot operations, are an excellent way to maintain situational awareness as well as manage information.
Other ways to maintain situational awareness include:
- Performing a verification check of all programming. Before departure, check all information programmed while on the ground.
- Checking the flight routing. Before departure, ensure all routing matches the planned flight route. Enter the planned route and legs, including headings and leg lengths, on a paper log, and use this log to evaluate what has been programmed (a sketch of this cross-check follows the list). If the two do not match, do not assume the computer data is correct; double-check the entry.
- Verifying waypoints.
- Making use of all onboard navigation equipment. For example, use VOR to back up GPS and vice versa.
- Matching the use of the automated system with pilot proficiency. Stay within personal limitations.
- Planning a realistic flight route to maintain situational awareness. For example, although the onboard equipment allows a direct flight from Denver, Colorado, to Destin, Florida, the likelihood of rerouting around Eglin Air Force Base’s airspace is high.
- Being ready to verify computer data entries. For example, incorrect keystrokes could lead to loss of situational awareness because the pilot may not recognize errors made during a high workload period.
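As a concrete illustration of the routing cross-check described above, the following Python sketch compares a paper navigation log against what was programmed, leg by leg. The data structures, waypoint identifiers, field names, and tolerances are all hypothetical; no real FMS exposes such an interface. The point is the discipline: an independent record of the planned legs gives the pilot something to verify the programming against.

```python
# Hypothetical sketch of the pre-departure routing cross-check: compare
# each programmed leg against the paper navigation log and flag any
# mismatch. Identifiers and tolerances are illustrative only.

PLANNED_LOG = [  # from the paper navigation log
    {"leg": "KAPA-PUB", "heading_deg": 175, "length_nm": 85},
    {"leg": "PUB-TBE",  "heading_deg": 135, "length_nm": 104},
]

PROGRAMMED = [  # as read back from the navigation system
    {"leg": "KAPA-PUB", "heading_deg": 175, "length_nm": 85},
    {"leg": "PUB-TBE",  "heading_deg": 153, "length_nm": 104},  # keystroke slip
]

def verify_routing(planned, programmed, hdg_tol=2, len_tol=2):
    """Return legs whose heading or length disagrees beyond tolerance."""
    mismatches = []
    for plan, prog in zip(planned, programmed):
        if (abs(plan["heading_deg"] - prog["heading_deg"]) > hdg_tol or
                abs(plan["length_nm"] - prog["length_nm"]) > len_tol):
            mismatches.append((plan["leg"], plan, prog))
    return mismatches

# The transposed heading (135 entered as 153) is caught before departure:
for leg, plan, prog in verify_routing(PLANNED_LOG, PROGRAMMED):
    print(f"{leg}: planned {plan['heading_deg']} deg, "
          f"programmed {prog['heading_deg']} deg")
```

Run as written, the sketch flags the transposed heading on the second leg, exactly the kind of high-workload keystroke error described in the last item above.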