Let's skip the preamble. You're reading this because you've seen something in a tender document that you weren't expecting — a prequalification question about AI governance, a clause asking for an AI tool register, or a requirement to declare your organisation's position on sovereign AI. And you're wondering whether this is real or just another compliance checkbox that nobody actually scores.
It's real. It's being scored. And if you don't have a documented position, you're already behind.
What Changed in 2025–2026
For most of the past decade, AI governance was a concern for tech companies and financial services firms. Infrastructure and heavy industry were largely left alone. That changed in 2025, and it changed fast.
The federal government's National AI Plan created a policy foundation that has since flowed directly into procurement. The Digital Transformation Agency's Policy for Responsible AI — now updated — established minimum expectations for how government agencies acquire technology and services. The DTA Model AI Clauses v2.0 gave procurement teams templated contract language they could drop straight into tender documents without needing specialist legal help to draft from scratch.
That last point matters more than it sounds. When the clause language is pre-written and ready to use, agencies use it. And they have been.
The Guidance for AI Adoption in Australian Government — known as GfAA — introduced a framework of essential practices. The specific practice attracting the most attention in supply chain procurement is AI6: accountability and oversight. When you see tender prequalification questions referencing "essential practices" language, that's where it comes from. AI6 is the benchmark evaluators are working against, whether or not the document names it explicitly.
NSW Government moved early. Federal agencies in Defence, Energy, and Infrastructure followed. The Security of Critical Infrastructure Act obligations have created additional pressure on asset owners to demonstrate they understand what AI tools are operating within their supply chains. The result is that governance requirements that looked voluntary twelve months ago are now baked into prequalification criteria.
The shift from optional to mandatory didn't happen with a single announcement. It happened clause by clause, tender by tender — and most contractors missed it until they were already being filtered out.
The 10 December 2026 deadline under the National AI Plan grace period is the next forcing function. Organisations that have treated AI governance as a future problem are running out of runway.
What Tender Documents Are Actually Asking
Here's what's appearing in current tender prequalification documents across Defence, Energy, and Infrastructure verticals. Not theory — actual clause categories showing up in RFTs and EOIs right now.
AI tool register. Can you produce a current register of AI tools in use on the project or within your organisation? Evaluators want to know what's running. A spreadsheet listing the tools, their purpose, their vendor, and who approved their use is the minimum. Many contractors can't produce this because nobody has ever formalised it.
Data residency position. Where does data processed by your AI tools reside? This is particularly live in Defence and critical infrastructure contexts. The question isn't whether you use cloud-based AI tools — almost everyone does. The question is whether you know where the data goes and whether that's consistent with project security requirements.
Sovereign AI statement. This is newer and causes the most confusion. A sovereign AI statement documents your organisation's position on the use of foreign-hosted AI infrastructure for sensitive project data. It doesn't require you to have solved the problem — it requires you to have thought about it and documented your position.
AI use disclosure policy. Do you have a written policy governing when and how staff disclose AI use in project deliverables? Tender evaluators are specifically asking whether AI-generated content in reports, designs, or recommendations is identified as such. This connects directly to the accountability requirements in AI6.
Accountability structures. Who in your organisation is accountable for AI governance decisions? This doesn't need to be a dedicated Chief AI Officer. It needs to be a named person with documented responsibility. Many smaller contractors fail here because the honest answer is "no one" — which isn't a viable answer in a scored evaluation.
Supply chain flow-down provisions. If you're a prime, are your subcontractors bound by the same AI governance requirements? Evaluators are asking primes to demonstrate that their governance position flows down through the supply chain. This is where Tier 2 and 3 contractors are increasingly caught out — more on that below.
The Audit Problem: Pre-Qual Is Not the Finish Line
Here's where it gets serious. Procurement teams aren't just reading your written responses and moving on. After shortlisting, a growing number of Defence and Infrastructure clients are conducting post-shortlist AI governance audits.
An audit is not a questionnaire. It's a conversation, and sometimes a document review. The evaluator sits across the table from someone in your organisation and asks them to walk through your AI governance documentation. If the documentation was written by your bid team the night before submission and nobody in the business has actually seen it, that becomes apparent very quickly.
You can write a compelling response to a prequalification question. You cannot fake an audit.
Forty-five per cent of tenderers are being marked down for generic answers. "We take AI governance seriously and follow industry best practices" is not an answer. It's a flag that you don't have a real governance position.
Specific answers win. An evaluator reading your response should be able to identify: which AI tools your organisation uses, who approved them, who's accountable if something goes wrong, how data is handled, and how that governance position flows to your subcontractors. If your response could belong to any company in Australia, it's too generic to score well.
The audit dimension also means that getting through prequalification with a strong written response doesn't protect you if you can't back it up. Contracts have been withdrawn after post-shortlist audits revealed that the governance documentation didn't reflect actual practice. That's a reputational problem that outlasts the tender.
What a Compliant Answer Looks Like
The good news: none of this is technically difficult. The documentation required to answer these questions well just has to be real, meaning it reflects how your organisation actually operates, and it has to exist before the tender drops.
A defensible AI governance position for a heavy industry contractor typically covers five areas:
- An AI tool register — a maintained list of approved AI tools, their business purpose, data inputs, vendor details, and the internal approval date. This can live in a spreadsheet. It needs to be current and owned by a named person.
- A data handling summary — a brief document that maps your key AI tools to their data residency position. Where does the data go? Is that consistent with your project obligations? Have you checked vendor terms? This is not a technical architecture diagram — it's a one- or two-page summary that a non-technical evaluator can read and understand.
- An AI use policy — typically two to four pages. Covers: which tools are approved for use, who can approve new tools, how AI-generated content is disclosed in deliverables, what's prohibited (e.g., feeding client data into unapproved tools), and what happens when someone breaches the policy.
- Accountability mapping — a simple document that names the person accountable for AI governance decisions and describes how issues are escalated. In a small business, this might be the Managing Director. In a larger organisation, it might be the General Manager of Technology or Operations. The name and the role matter more than the title.
- A sovereign AI statement — a short, plain-English document that sets out your organisation's position on where sensitive project data can and cannot be processed. It can acknowledge that you use offshore-hosted tools for some purposes while committing to specific controls for sensitive project contexts. Honesty is better than overclaiming.
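To make the first of those documents concrete: the tool register really can be a handful of named columns in a shared spreadsheet. The sketch below is illustrative only — the column names and the example entry are assumptions, not a mandated schema — but it shows the level of detail an evaluator expects to see.

```python
# Illustrative sketch of a minimal AI tool register serialised as CSV,
# so it can live in an ordinary spreadsheet. Field names are assumptions.
import csv
import io

FIELDS = [
    "tool", "business_purpose", "vendor", "data_inputs",
    "data_residency", "approved_by", "approval_date",
]

# One hypothetical entry; real registers list every tool actually in use.
register = [
    {
        "tool": "ExampleDraftAI",
        "business_purpose": "first-draft report text",
        "vendor": "Example Vendor Pty Ltd",
        "data_inputs": "non-sensitive project notes",
        "data_residency": "Australia (vendor-stated)",
        "approved_by": "GM Operations",
        "approval_date": "2026-01-15",
    },
]

def write_register(entries, fields=FIELDS):
    """Serialise register entries to CSV text for a shared spreadsheet."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields)
    writer.writeheader()
    writer.writerows(entries)
    return buf.getvalue()

print(write_register(register))
```

The point is not the format — a spreadsheet maintained by hand is equally acceptable — but that every row names the tool, its purpose, where its data goes, and who approved it, with a named owner keeping the file current.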
None of this requires an AI ethics specialist or a law firm. It requires someone to sit down for a day or two, work through each area honestly, and produce documents that reflect what actually happens inside your organisation.
The GfAA essential practices — AI6 in particular — give you a useful checklist for what accountability and oversight documentation should address. If your documentation can demonstrate alignment with AI6 requirements, you're in a strong position to answer the majority of what tender documents are currently asking.
The Supply Chain Dimension — What Primes Are Requiring of Subs
If you're a Tier 2 or Tier 3 subcontractor, there's a dynamic playing out that directly affects you even if you're not responding to government tenders yourself.
Primes are being required to demonstrate that their AI governance position flows down through the supply chain. This means primes are now including AI governance requirements in their subcontract agreements. Some are doing it through formal contract clauses. Others are doing it through prequalification questionnaires before they'll engage a subcontractor on a project at all.
The 32 per cent prequalification failure rate is concentrated among subcontractors. The reason is straightforward: subcontractors haven't been exposed to these questions before, and they don't have documentation ready. When a prime sends a prequalification questionnaire asking for an AI tool register and a data residency position, many subs either leave it blank or provide an answer so vague that it raises more questions than it answers.
Primes are also increasingly aware that their own audit exposure extends to their supply chain. If an evaluator asks a prime how it governs AI use among its subcontractors and the answer is "we don't," that's a problem for the prime. So primes are becoming more rigorous about what they require from subs before engagement.
The practical implication: if you're a subcontractor working in Defence, Energy, Infrastructure, or large-scale construction, you need AI governance documentation even if you never respond to a government tender directly. Your prime is going to ask for it — and the contractors who have it ready are the ones who get on the preferred panel and stay there.
Smaller contractors sometimes argue that they use almost no AI tools and therefore the question doesn't apply to them. That's understandable, but it misreads how evaluators interpret a non-response. The absence of an AI governance position doesn't signal that you're not using AI — it signals that you haven't thought about it. In the current environment, that's a governance gap, not an exemption.
What to Do Before Your Next Tender Drops
The window between now and the 10 December 2026 National AI Plan grace period deadline is your opportunity to build a governance position that holds up — not just in written responses, but in audits.
Here's a practical sequence:
Start with the tool register. Talk to your project managers, engineers, and bid teams. Find out which AI tools are actually in use: not just what's officially sanctioned, but what's actually running. You'll probably find a wider list than you expected. Document it, then make decisions about which tools are approved for ongoing use and under what conditions.
Check your data handling before it becomes a problem. Take your five most-used AI tools and look at where the data goes. Most vendor terms are publicly available. You don't need a lawyer to read a privacy policy and determine whether data is processed in Australia or overseas. Document what you find.
Write the policy, then brief your team. A policy that lives in a folder on the server and that nobody has read is not a governance position — it's a document. Brief the people who need to know it. Keep a record of the briefing. That's the difference between a document and a practice.
Name the accountable person now. Don't wait for a tender to force the question. Decide who in your organisation owns AI governance decisions and tell them they own it. That person should understand what the GfAA essential practices require and be able to speak to your organisation's position without a script.
Align with your primes before they ask. If you're a subcontractor with regular prime relationships, contact your primes and ask what they'll be requiring in the next twelve months. Getting ahead of the requirement builds confidence. Scrambling to respond when a prequalification questionnaire arrives does the opposite.
The contractors who will be well-positioned when the next significant tender drops are not the ones with the most sophisticated AI governance frameworks. They're the ones who have documentation that reflects reality, is owned by a named person, and can survive a direct question from an evaluator who has been briefed to probe.
That's achievable for any contractor willing to spend the time now rather than the night before submission.
Need to review your AI governance position before a tender drops?
James works with prime contractors and subcontractors across heavy industry to build defensible, audit-ready governance documentation. Fixed scope, clear outcomes.
Talk to James about your AI governance position