When it comes to communication between flight crews and air traffic control, voice transmissions over radio have a lot of drawbacks. The sound quality is often poor, which can make transmissions hard to understand. Flight crews engaged in critical tasks must remain constantly alert for their call sign and relevant advisories while filtering out unrelated chatter. And frequency congestion can make it difficult to communicate important information in a timely fashion.
All of these are compelling reasons for moving air traffic management to a system primarily based on digital data. Because a wholesale transformation of the voice-based ATC system is unlikely to happen anytime soon, however, some researchers are exploring how to use natural language processing (NLP) — a branch of artificial intelligence focused on understanding human speech and writing — to reduce workload for pilots and air traffic controllers.
NLP, in the form of automatic speech recognition, featured on the Airbus UpNext DragonFly demonstrator, an A350-1000 equipped with multiple cutting-edge technologies, including an automated emergency diversion function and automatic landing capability, that were tested in flight last year. Automatic speech recognition served several purposes on the demonstrator, including transcribing the automatic terminal information service (ATIS) broadcast.
More notably, it was used to transcribe a controller’s taxi instructions and overlay them on a cockpit display of the airport environment — showing potential to improve pilots’ situational awareness and help prevent runway incursions.
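To make a transcription useful for a map overlay, the system must convert the free-form clearance into structured fields such as the assigned runway, the taxi route, and any hold-short points. As an illustration only, the sketch below parses a transcribed taxi clearance with simple pattern matching; the phrase patterns and the `parse_taxi_clearance` function are hypothetical simplifications, not Airbus's actual grammar or software.

```python
import re

def parse_taxi_clearance(transcript: str) -> dict:
    """Extract runway, taxi route, and hold-short point from a transcribed
    taxi clearance. A toy sketch: real ATC phraseology is far more varied."""
    text = transcript.lower()
    # Assigned runway, e.g. "taxi to runway 27"
    runway = re.search(r"taxi to runway (\w+)", text)
    # Taxiway route, e.g. "via alpha bravo" (stops at a comma or "hold short")
    via = re.search(r"via ([\w\s]+?)(?:,|hold short|$)", text)
    # Hold-short restriction, e.g. "hold short of runway 22"
    hold = re.search(r"hold short of (runway \w+|\w+)", text)
    return {
        "runway": runway.group(1) if runway else None,
        "route": via.group(1).split() if via else [],
        "hold_short": hold.group(1) if hold else None,
    }
```

Structured output like this is what would let an avionics display highlight the cleared taxiways and flag the hold-short runway, rather than showing raw text.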