Issue 440 | September 2016
In modern-day aircraft, automation is a reality that facilitates procedures and allows precision never before achieved in flight operations. Thorough systems knowledge and keen management skills are required to operate the automation effectively. To that end, the aviation community has proactively educated itself, honed its skills, and created new paradigms. Many improvements have been made, but pilots are human and automation is complex. Automation can clearly improve flight safety, but it may also spawn new opportunities to err.
Automation errors may occur in almost any flight regime. Operational programming errors are common. Errors suggesting a lack of knowledge or understanding are less frequently reported. ASRS often receives reports suggesting that aircrews believe their automation is accomplishing a desired task when, in actuality, it is not. As aircrews rely more exclusively on automation, a tendency can arise to place more trust in it than may be prudent. Perhaps the most interesting of the complex automation phenomena reported are of the human factors type. They are central to the complicated relationships among situational awareness, judgment, and automation management that heighten the human vulnerability of being lulled into a false sense of security and thinking that “the automation has it.”
This month, CALLBACK looks at a small sample of incidents that describe reduced awareness, dependency, overreliance, and management errors that occur with automation. You can see how the incidents developed and can project how they may have concluded had the errors not been discovered.
How Low Should You Go?
This B737 Captain, acting as Pilot Monitoring during a descent via a STAR, set the arrival’s bottom altitude and trusted the VNAV automation to honor the intermediate crossing restrictions. A distraction and a procedural lapse produced an altitude deviation that ATC had to correct.
From the Captain's report:
■ We were cleared to descend via the arrival landing south. As the Pilot Monitoring (PM), I set the lowest altitude on that STAR, which was 6,000 feet, and…then accidentally abrogated my PM duties by not stating, “I’ll set the next lowest altitude of FL220,” as we approached [the altitude restricted fix] in Level Change pitch mode. Already high on the profile and well above crossing restrictions, it wasn’t of immediate concern, but [it was] completely improper procedure on my part. Instead of correcting that, I passed the radios to the First Officer as I took to the [public address (PA) system] to offer a good-bye to our customers.
[After I finished] with the PA, I reported, “Back on number 1 radio,” to the First Officer, who had switched us to Approach but had not yet checked in. I…checked in and reported, “Descending via the…arrival.” I did not refer to the Primary Flight Display (PFD) to check what pitch mode we were in, but the Controller said, “Climb and maintain 10,000 feet.” We were on a STAR, and this was such an unusual call.… I said, “Say again,” and the Controller unemotionally repeated, “Climb and maintain 10,000 feet.” We complied immediately. By that time I saw that the bottom [altitude] window of the next fix showed 10,000. The Controller then asked, “Why were you down at 6,000 feet?” I said, “My bust,” as there was no excuse for this performance.
I had been relying on the VNAV automation instead of the old-fashioned, “Set the next lowest altitude,” which forces both pilots [to be] situationally aware with respect to the profile. I was allured by the pure beauty of a clear spring day and was obviously much less aware than I needed to be.
From the First Officer’s report:
■ The Captain set 6,000 feet into the MCP altitude window, and we both verified it against the bottom altitude of the arrival.… The Captain [reported to Approach Control] that we were descending via the arrival. At this point I simply was not looking at our displays, and a very short time later we were told to climb to 10,000 feet from our current altitude of 6,000 feet.… I knew right away that we never got back into VNAV PATH for protection.
Teetering on the Approach
A Gulfstream Captain, experiencing strong winds during an approach, became fixated on the automation’s correction. He then lost sight of his own situation and the airport.
■ During the arrival into Teterboro, we were cleared for the ILS to Runway 6. The Pilot in Command (PIC) let the autopilot drift left of the centerline, and [I told him] that the airport was in sight at one o’clock. The PIC’s comment was, “Look at how much correction this thing is putting in.” We continued to drift left. I told him again that the centerline was to the right and that the airport was in sight. The PIC turned right and started to descend. Then he said that he had lost sight of the [airport]. I told him that the airport was at eleven o’clock and that he was way too low for where we were. I [pointed out] the towers south of [the airport] to him twice. He then said he had them and asked where the stadium was. At this time the Tower came on the frequency and gave us a low altitude alert. The airport was at our ten o’clock position, but at this point, I lost sight of the airport and told the PIC to go around. At that point, we both picked up the airport visually and landed without further incident.
The ride had been extremely rough for the preceding 20 minutes. The wind at 4,000 feet was out of the northwest at 65 knots. The [reported] landing wind was from 330 [degrees] at 19 [knots, gusting to] 25 [knots]. This [is] a classic example of how automation dependency can cause a very experienced pilot to lose situational awareness and ignore the basics of flying the aircraft.
A Descending STAR
A Gulfstream aircrew was given two runway changes during the arrival, and the automation did not quite lead them down the correct vertical path. The descent geometry the crew faced is sketched after the report.
■ The FMS was programmed with the arrival, and VNAV was selected. All seemed well as we descended to, and crossed, HOMRR at 16,000 feet and 250 knots. However, the next fix, VNNOM, required crossing between 11,000 feet and 10,000 feet. VNNOM is 4.1 nautical miles from HOMRR. Crossing HOMRR at 16,000 feet, we realized that it was almost impossible to lose 5,000 to 6,000 feet in 4.1 nautical miles. At this point I clicked off the automation and pointed the nose down, achieving a descent rate of better than 6,000 feet per minute. Our airspeed increased to 280 knots, and we crossed VNNOM high and fast.
The STAR called for crossing HOMRR at or below 16,000 feet, and the FMS should have been in a position to make the subsequent fix. Obviously we could have done a better job monitoring the situation.… We made, programmed, and verified two runway and approach changes during this descent prior to HOMRR. In fact, the first change went from a landing east flow to a landing west flow. This could actually explain why the FMS logic chose 16,000 feet at HOMRR instead of lower.… Landing east on the EAGUL FIVE requires crossing [the next fix] immediately past HOMRR between 15,000 feet and 14,000 feet.
This is a really poorly designed STAR. Something should be done to warn other aircrews not to fall into the same trap.1
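A quick check of the numbers, assuming a groundspeed near the reported 250 knots (actual winds would shift the figures somewhat), shows why the crew judged the crossing nearly impossible: the 4.1 nautical miles between HOMRR and VNNOM pass in roughly a minute, so honoring the restriction demands a descent rate close to the 6,000 feet per minute the crew ultimately flew.

\[
t \approx \frac{4.1\ \text{nm}}{250\ \text{kt}} \approx 0.98\ \text{min}, \qquad
\text{required rate} \approx \frac{5{,}000\ \text{to}\ 6{,}000\ \text{ft}}{0.98\ \text{min}} \approx 5{,}100\ \text{to}\ 6{,}100\ \text{ft/min}
\]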
The Virtual Green Flash
Automation dependency also exists in the ATC environment. A Center Controller, while using an automated handoff procedure, “flashed” several aircraft to incorrect sectors. This alert Controller noticed the problem, bypassed the automation, and minimized the airspace violation.
■ I was working Sector XX, R-Side and D-Side combined. Traffic was moderately busy, and we had overflights available through the [airspace], which [added] some complexity. I was flashing several aircraft to Approach to initiate our flash-through procedure. The automation forwarded the handoffs [incorrectly] to Sector YYG instead of YYB. [Initially,] I did not notice that in my scan, and one of the aircraft penetrated [the adjacent sector’s] boundary without a handoff having been completed. I called Sector YYB for the late point-out and redirected the [automated, incorrect] handoff from Sector YYG to YYB. The Controller there took the handoff and flashed it on to Sector ZZ.
This is a repeated problem with YY Approach's automation. I would recommend that their automation forward the handoffs correctly so that the appropriate sector sees the handoff flashing at them.
More Than Meets the Eye
This B737 aircrew trusted their automation to calculate the descent point, but they did not consider the winds. The situation was compounded when the action they took to solve the first problem created a second.
From the First Officer's report:
■ [We were] given the crossing restriction 10 [miles] north of HIELY at 13,000 feet. [I] got behind on the descent, asked for relief, and the Controller gave us a heading and a descent to 13,000 feet. [We] entered moderate chop, and I oversped [the aircraft about] 5 knots or so [in the] clean configuration. I was…rushing to comply, and, along with the chop, I got behind the aircraft. I need to do a better job cross-checking the automation against what the restrictions actually are. I was trusting the automation too much for when to start my descent.
From the Captain's report:
■ [Our mistake was] overreliance on the automation for planning the descent. [We should have] double-checked that it made sense with the winds and should have been more aware of speed control when using vertical speed to try to comply with a crossing restriction.
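To illustrate why wind matters for descent planning, here is a rough rule-of-thumb check using hypothetical figures, not numbers from the report: on a typical three-degree descent path (about 318 feet per nautical mile), the required vertical speed scales with groundspeed, so a tailwind both raises the required rate and pushes the ideal top-of-descent point farther back than a no-wind calculation would suggest.

\[
\text{required rate} \approx \frac{GS}{60} \times 318\ \tfrac{\text{ft}}{\text{nm}}:
\quad 280\ \text{kt} \Rightarrow \approx 1{,}480\ \text{ft/min}, \qquad
280\ \text{kt} + 60\ \text{kt tailwind} \Rightarrow \approx 1{,}800\ \text{ft/min}
\]

A descent path built without current wind data can therefore place the descent point too late, which is the trap this crew describes.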
1 The EAGUL FIVE arrival is now the EAGUL SIX arrival, and the altitude restrictions have been changed so that the descent path is much more tenable given a runway change just prior to HOMRR.
| July 2016 Report Intake | |
|---|---|
| Air Carrier/Air Taxi Pilots | 4,742 |
| General Aviation Pilots | 1,124 |
| Controllers | 660 |
| Flight Attendants | 543 |
| Military/Other | 304 |
| Mechanics | 155 |
| Dispatchers | 148 |
| TOTAL | 7,676 |
A Monthly Safety Newsletter from The Office of the NASA Aviation Safety Reporting System
P.O. Box 189 | Moffett Field, CA | 94035-0189
http://asrs.arc.nasa.gov