The Problem That Regulation Is Now Catching Up To

AI tools have been entering MRO operations through side doors for several years. Technicians use general-purpose chatbots to interpret procedures. Maintenance planning teams run documents through commercial AI summarizers. Some organizations have deployed enterprise knowledge platforms — built for office environments — and adapted them, imperfectly, to hangar workflows.

Regulators noticed.

The fundamental issue is not whether AI is useful in maintenance environments. It demonstrably is. The issue is accountability: when an AI system provides a maintenance instruction, who is responsible for that instruction's accuracy? How does the organization prove — to an auditor, to an accident investigator, to the NAA — that the AI's output was grounded in approved data and not in a hallucinated synthesis of internet training material?

Until now, no regulatory framework in aviation existed to answer these questions. EASA NPA 2025-07 is the beginning of that answer.

What EASA NPA 2025-07 Establishes

NPA 2025-07, currently under formal consultation as of early 2026, addresses the trustworthiness of AI systems used in aviation maintenance and operations. While the full regulatory text will be subject to revision through the consultation process, the direction of travel is clear and the core requirements are stable enough to act on now.

The Notice of Proposed Amendment focuses on four interconnected principles:

Principle 1: Source Traceability

Any AI-generated response used to support a maintenance decision must be traceable to an approved data source. The response cannot simply be presented as correct — it must cite the specific document, revision, section, and page from which the answer was derived. This requirement directly targets the risk of AI hallucination.
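As an illustration only: the traceability principle implies that a compliant system treats the citation as a structured, mandatory part of every response, not as optional free text. A minimal sketch of what such a record could look like (the field names are hypothetical, not prescribed by the NPA):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SourceCitation:
    """Structured citation tying a response to approved data.
    Field names are illustrative, not taken from NPA 2025-07."""
    document: str   # e.g. "AMM"
    revision: str   # e.g. "Rev 14"
    section: str    # e.g. "32-41-00"
    page: int

@dataclass(frozen=True)
class AIResponse:
    answer: str
    citation: SourceCitation  # required field: no citation, no response

resp = AIResponse(
    answer="Torque the axle nut to the listed values.",
    citation=SourceCitation(document="AMM", revision="Rev 14",
                            section="32-41-00", page=212),
)
print(resp.citation.section)
```

Making the citation a required, immutable field means a response without a source simply cannot be constructed, which is the architectural point the principle makes.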

Principle 2: Approved Data Boundaries

AI systems used in Part 145 environments must operate exclusively within the organization's approved documentation corpus. A system that draws on external internet data, general training corpora, or unapproved third-party sources does not satisfy this requirement — regardless of how accurate its outputs appear.
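The boundary this principle describes is enforceable before the model is ever involved: anything outside the approved corpus is filtered out at retrieval time. A sketch, with hypothetical document titles, of that hard boundary (real retrieval would also rank the surviving passages; the point here is the exclusion, not the ranking):

```python
# Hypothetical approved-corpus allowlist for a Part 145 organization
APPROVED_CORPUS = {"AMM Rev 14", "CMM Rev 6", "SRM Rev 9"}

def retrieve(query: str, documents: dict[str, str]) -> dict[str, str]:
    """Return only material from approved documents; unapproved
    sources are excluded before any AI processing occurs."""
    return {title: text for title, text in documents.items()
            if title in APPROVED_CORPUS}

docs = {
    "AMM Rev 14": "Axle nut torque values...",
    "forum_post.html": "Unverified internet advice...",
}
print(sorted(retrieve("axle nut torque", docs)))
```

However accurate the forum post happened to be, it never reaches the model, which is exactly the "regardless of how accurate its outputs appear" standard.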

Principle 3: Auditability of AI Interactions

Organizations must be able to demonstrate, on request, what their AI systems told technicians, when, and on what basis. This implies timestamped, user-attributed records of AI queries and responses, linked to the source data used.
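In practice this implies an append-only interaction log. A minimal sketch of one such record (the schema is illustrative, assumed for this example, and would in a real system be written to tamper-evident storage):

```python
import json
from datetime import datetime, timezone

def log_interaction(user: str, query: str, answer: str, citation: str) -> str:
    """Build one timestamped, user-attributed record of an AI
    interaction, linked to the source citation used."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "query": query,
        "answer": answer,
        "citation": citation,
    }
    return json.dumps(record)

entry = log_interaction("tech-0427", "brake wear limits?",
                        "Wear limit is 2.5 mm.",
                        "AMM Rev 14, 32-41-00, p. 212")
print(json.loads(entry)["citation"])
```

Given such records, "what did the AI tell this technician, when, and on what basis?" becomes a query, not an investigation.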

Principle 4: Human Authority over AI Output

NPA 2025-07 establishes that AI in safety-critical environments must support human decision-making — not substitute for it. The Accountable Manager and the signing technician retain full responsibility. The AI is a tool, not an authority.

These requirements are not hypothetical. They define the architecture that compliant AI systems in Part 145 organizations will need to demonstrate when the NPA is finalized as an Acceptable Means of Compliance (AMC).

What This Means for Your Organization Today

The consultation period exists precisely to allow organizations to engage with the regulatory process and prepare their operations. The time between an NPA and final adoption has historically been used by proactive MROs to evaluate their current tools against the incoming standard — and to avoid the cost and disruption of emergency retrofitting after the fact.

The immediate question for any Part 145 Accountable Manager or Quality Manager is: what AI tools are currently in use in your organization, and can they satisfy NPA 2025-07 requirements as written?

General-purpose AI assistants cannot satisfy source traceability requirements because they do not operate within bounded, approved documentation corpora. They generate answers from training data, not from your AMM revision 14.

Enterprise knowledge platforms adapted to aviation contexts face a similar challenge: most were designed for search and retrieval, not for citation-level traceability with regulatory documentation requirements. Adapting them to NPA 2025-07 compliance requires architectural changes their vendors have not committed to.

Document management systems with AI search layers surface documents — they do not extract and cite specific answers with the precision NPA 2025-07 requires.

The compliance gap is structural, not cosmetic. It cannot be closed with a policy update.

Building Toward Compliance — Practical Steps for Part 145 Organizations

  1. Audit your current AI use. Identify every point in your operation where AI tools — formal or informal — are used to inform maintenance decisions. Include tools your technicians use without formal organizational approval.
  2. Evaluate tools against the NPA framework. For each identified tool, assess whether it: (a) operates within your approved documentation only, (b) cites specific sources at the page and section level, (c) generates auditable interaction logs, and (d) was designed with regulatory environments in mind.
  3. Engage with the consultation. EASA's consultation process exists to receive input from regulated organizations. Your Quality Manager or Compliance team should review the NPA text and consider submitting comments.
  4. Establish an AI governance policy. Your Quality Management System should define acceptable AI use within Part 145 operations: approved tools, prohibited use cases, oversight requirements, and the human authority chain for AI-assisted decisions.
  5. Prioritize source traceability in any procurement. Ask vendors specifically: can you show us, for any AI response your system generates, the exact document, revision, section, and page it came from? If the answer requires explanation or qualification, the tool does not meet the NPA standard.
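The four criteria in step 2 lend themselves to a simple pass/fail screen. A sketch of such a check, with hypothetical key names that you would adapt to your own assessment form:

```python
def meets_npa_framework(tool: dict) -> bool:
    """Screen a tool against the four step-2 criteria.
    All four must hold; any single failure fails the tool."""
    criteria = (
        "approved_corpus_only",      # (a) bounded to approved documentation
        "page_level_citations",      # (b) cites source at page/section level
        "auditable_logs",            # (c) timestamped, attributed records
        "designed_for_regulated_use" # (d) built for regulatory environments
    )
    return all(tool.get(c, False) for c in criteria)

# A general-purpose chatbot fails on every criterion
chatbot = {"approved_corpus_only": False, "page_level_citations": False,
           "auditable_logs": False, "designed_for_regulated_use": False}
print(meets_npa_framework(chatbot))
```

The all-or-nothing structure mirrors the article's point: partial coverage — good search without citations, or citations without logs — still fails the screen.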

DokPath and EASA NPA 2025-07

Designed for Compliance from Day One

DokPath was designed with NPA 2025-07's core requirements as a specification, not as a compliance layer to be added afterward.

Every response the system generates includes the source document, revision number, section, and exact page. The system operates exclusively within the organization's approved documentation corpus. Every technician interaction is logged with timestamp, user attribution, and the full source citation used. Compliance reports can be exported in PDF format for audit use.

This is not a sales claim. It is an architectural description. The compliance capability is built into how the system retrieves and presents information — it is not a feature that can be turned on or off.

For Part 145 organizations preparing for NPA 2025-07, the evaluation path is straightforward: request a demonstration and ask your Quality Manager to validate the source citation and audit log outputs against the NPA requirements as written.

Contact us to schedule a compliance evaluation
Disclaimer: This article reflects the state of EASA NPA 2025-07 as of the consultation period open in early 2026. Organizations should consult the official EASA documentation and their regulatory advisors for guidance specific to their approval certificates and operational context.