The following terms appear in The Air Current’s special report on certifying artificial intelligence/machine learning in aviation and are included here as a quick reference, along with some related items. Definitions are drawn from a variety of sources and TAC’s reporting and may vary somewhat from others’ definitions of the expressions.
Agent: A software system that uses AI to autonomously perform tasks in response to predetermined goals. Agents are typically developed through a machine learning process called reinforcement learning in which the algorithm rewards positive results and punishes negative results.
Algorithm: A set of rules that defines a sequence of operations. In machine learning, it refers to the procedure used to train a machine learning model to identify patterns in training data and make predictions on new data.
ARP4754: “Guidelines for Development of Civil Aircraft and Systems,” an Aerospace Recommended Practice (ARP) document published by standards organization SAE. It provides a process for identifying and minimizing the effect of development errors in aircraft system design, and is accepted by regulators as a means of compliance with system functional and safety regulations.
ARP4761: “Guidelines for Conducting the Safety Assessment Process on Civil Aircraft, Systems, and Equipment,” an SAE Recommended Practice that provides a process for ensuring safe design of aircraft systems. It is accepted by regulators as a means of compliance with system safety regulations.
ARP6983/ED-324: “Process Standard for Development and Certification Approval of Aeronautical Products Implementing AI,” a Recommended Practice currently in development by the joint SAE/EUROCAE G-34/WG-114 working group. It would potentially provide a process to support certification of aircraft systems and equipment incorporating machine learning. The first version of the standard will be limited to frozen ML models developed through supervised learning.
Artificial intelligence (AI): The ability of a machine-based system to perform tasks typically associated with human intelligence. “AI” is also often used to refer to the specific techniques used to perform these tasks, such as expert systems or machine learning.
Artificial neural network: A computational algorithm consisting of connected nodes or “neurons,” typically organized in layers, that successively transform input data. Artificial neural networks identify patterns in training data and apply them to new data.
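To make the structure concrete, here is a minimal sketch of a forward pass through a tiny two-layer network. The weights are invented placeholders, not a trained or certified model; real networks have far more neurons and learn their weights from data.

```python
# Minimal artificial neural network forward pass (illustrative
# placeholder weights, not a trained model).

def relu(values):
    # A common activation function: pass positives, zero out negatives.
    return [max(0.0, v) for v in values]

def dense(inputs, weights, biases):
    # Each output neuron is a weighted sum of all inputs plus a bias.
    return [sum(w * x for w, x in zip(neuron_weights, inputs)) + b
            for neuron_weights, b in zip(weights, biases)]

def forward(x):
    hidden = relu(dense(x, [[0.5, 0.1], [-0.2, 0.3]], [0.0, 0.0]))
    return dense(hidden, [[1.0, -1.0]], [0.0])

print(forward([1.0, 2.0]))  # a single output value near 0.3
```

A “deep” network is the same idea with many intermediate layers stacked between input and output.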
Deep neural network: An artificial neural network with multiple intermediate layers, associated with “deep learning.”
Deterministic: With respect to computer programs, a system that always generates the same output given a specific input. “Deterministic” is sometimes used to refer to conventional, logic-based software in contrast to machine learning, although a “frozen” ML model implemented in suitable hardware may also be deterministic. Initial aviation certification efforts are focused on these types of frozen models that do not continue learning after being implemented in an aircraft.
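The property can be illustrated with a sketch: a frozen model is effectively a pure function of its input, because its parameters are fixed after training. The parameter values and threshold below are hypothetical, chosen only to show the behavior.

```python
# A "frozen" ML model behaves deterministically: its parameters are
# fixed, so the same input always produces the same output.
# (Hypothetical weights and threshold, for illustration only.)

FROZEN_WEIGHTS = (0.8, -0.5)  # fixed when the model was frozen

def frozen_model(x1, x2):
    score = FROZEN_WEIGHTS[0] * x1 + FROZEN_WEIGHTS[1] * x2
    return "alert" if score > 0.0 else "normal"

# Repeated calls with identical input are guaranteed identical:
assert frozen_model(1.0, 0.5) == frozen_model(1.0, 0.5)
```

A model that continued learning in service would not have this property, which is one reason initial certification efforts exclude it.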
Development assurance: A set of processes that provide confidence that complex aircraft systems both perform their intended function and are free of unacceptable unintended behavior.
Development assurance level (DAL): The level of developmental rigor required for an aircraft function (FDAL) or item (IDAL). Five levels range from the most rigorous, DAL A (associated with the potential for catastrophic safety effects) to the least rigorous, DAL E (no safety effect).
DO-178: “Software Considerations in Airborne Systems and Equipment Certification,” the core document for assuring the safety of airborne software, published by the standards organization RTCA.
DO-254: “Design Assurance Guidance for Airborne Electronic Hardware,” an RTCA standard used to assure the safety of airborne hardware.
Expert systems: A branch of AI concerned with developing systems that manipulate inputs using logical rules and are meant to emulate the decision-making ability of a human expert. This form of AI was popular in the 1970s and ’80s.
Functional hazard assessment: An assessment that identifies each function on board the aircraft and the potential consequences associated with its loss or malfunction.
G-34/WG-114: The joint SAE/EUROCAE working group that is developing ARP6983/ED-324.
Generative AI: A type of AI that learns the underlying patterns of training data and uses them to generate new data, such as text, images or audio.
Industry consensus standard: A document that outlines best practices or generally accepted requirements in a given field, collaboratively developed by members of the relevant industry.
Item: In development assurance, hardware or software installed on an aircraft to meet an established requirement.
Large language model: An ML model trained on a vast amount of text and designed for natural language processing tasks such as language generation. Large language models power chatbots such as ChatGPT.
Learning assurance: As introduced by the European Union Aviation Safety Agency (EASA), a set of activities intended to provide confidence that an ML model’s training data is correct and complete and that the model can perform well on unseen data, beyond the data it was originally trained on.
Machine learning (ML): A branch of AI concerned with the development of algorithms that allow computers to learn from data, rather than following hard-coded rules.
ML constituent: A concept initially introduced by EASA and incorporated into the draft version of ARP6983/ED-324, defined as a bounded collection of hardware and/or software items, at least one of which contains an ML model. It modifies conventional development assurance practices by occupying an intermediate level between the system and item levels.
ML model: A computer program that has been trained using a machine learning algorithm. It receives input data and outputs a prediction or decision.
Means of compliance: An acceptable way of complying with a regulation. An industry consensus standard may serve as a means of compliance if accepted by the regulator.
Requirements: In development assurance, something that an aircraft, system or item needs to do. There are different classes of requirements (such as customer requirements or installation requirements) and not all requirements are applicable at every level.
S-18/WG-63: The joint SAE/EUROCAE committee that developed and maintains ARP4754B and ARP4761A, both widely considered landmark documents in aircraft system safety certification and recognized by regulators as a means of compliance with system safety regulations.
Safety assessment: An assessment at the aircraft or system level that identifies safety objectives based upon the desired architecture (preliminary safety assessment) and verifies that those safety objectives are met in the implemented product (final safety assessment).
Supervised learning: A form of machine learning that uses labeled training data, which consists of sample data points along with the correct outputs or answers. For example, the training data for a visual traffic detection system may include images with objects correctly labeled by humans as airplanes, birds or drones.
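The idea can be sketched with a toy classifier: each training example pairs a feature vector with a human-assigned label, and the trained model assigns new inputs the label of the most similar labeled example. The features, values and labels below are invented for illustration, and the 1-nearest-neighbor rule stands in for whatever learning algorithm a real system would use.

```python
# Supervised learning sketch: labeled training data (feature vector
# paired with a human-assigned label) drives a toy 1-nearest-neighbor
# classifier. All features and values are hypothetical.

TRAINING_DATA = [
    # ((wingspan_m, speed_mps), correct label supplied by a human)
    ((2.0, 30.0), "airplane"),
    ((0.5, 10.0), "bird"),
    ((0.3, 15.0), "drone"),
]

def classify(features):
    # Predict the label of the closest labeled training example.
    def squared_distance(sample):
        return sum((a - b) ** 2 for a, b in zip(sample[0], features))
    return min(TRAINING_DATA, key=squared_distance)[1]

print(classify((0.4, 14.0)))  # labeled "drone" by its nearest example
```

Contrast this with unsupervised approaches, where the training data carries no correct answers for the algorithm to learn from.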
System: In development assurance, a level of aircraft design organized around functions, such as the supply of electrical power by the aircraft. Systems allocate requirements to items, which implement hardware and/or software to support the function.
Training data: The data used to train a machine learning model to make useful predictions.