I've sat in a lot of DDA audit prep meetings over the past few years. They tend to follow a predictable pattern.
Someone from the agency's digital team pulls up a spreadsheet. There are hundreds of rows. Missing alt text, empty link text, contrast failures, heading order problems. The room goes quiet. Then someone says, "How long do we have?" And someone else says, "Six weeks." And then the fun really begins.
What I've noticed lately, though, is that a significant chunk of those failures can now be caught and fixed before they make it into the audit spreadsheet. Not because of some new third-party tool or a consulting engagement. Because of features that Microsoft quietly shipped inside SharePoint and M365 over the last twelve months or so, features that most government content teams have no idea are there.
I want to walk through those tools specifically: what each one does, where to find it, when it genuinely helps, and where it hits its limits. Because there are real limits. Anyone who tells you AI will solve your accessibility compliance on its own is either selling something or hasn't actually sat with a screen reader user going through a government website.
But as a starting point for the most common categories of failure? These tools are genuinely useful. And they're already in your licence.

The Stats That Haven't Changed (And That's the Problem)
Before we get into the tools, here's the thing that keeps me coming back to accessibility work.
The WebAIM Million Report 2025 tested the top one million home pages for detectable WCAG 2 failures. The result? 94.8% had at least one automatically detectable failure. The average page had 51 errors (WebAIM, 2025). And here's the part that genuinely stings: the six most common failure types have been essentially unchanged for five years straight.
Low contrast text. Missing alt text for images. Missing form input labels. Empty links. Empty buttons. Missing document language. Those six categories account for about 96% of all detected errors.
I bring this up because it's both discouraging and, in a weird way, hopeful. Discouraging because we've known about these exact failure patterns for years and still haven't fixed them at scale. Hopeful because they're the categories where AI assistance is most practical. These aren't nuanced cognitive accessibility judgements. They're detectable, repeatable, fixable.
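Those six categories really are mechanically detectable. As a toy illustration of that point (not a substitute for a real scanning engine such as axe-core, and checking only three of the six), a short stdlib-only Python sketch can flag them in raw HTML:

```python
# Minimal sketch: flag three of WebAIM's most common detectable failures
# (missing alt text, empty link text, skipped heading levels) in raw HTML.
# Illustrative only -- a production scan needs a proper engine.
from html.parser import HTMLParser

class A11yScan(HTMLParser):
    def __init__(self):
        super().__init__()
        self.issues = []
        self.last_heading = 0   # last heading level seen (0 = none yet)
        self._link_depth = 0    # are we inside an <a> element?
        self._link_text = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.issues.append("img missing alt attribute")
        if tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            level = int(tag[1])
            if self.last_heading and level > self.last_heading + 1:
                self.issues.append(f"heading skip: h{self.last_heading} -> h{level}")
            self.last_heading = level
        if tag == "a":
            self._link_depth += 1
            self._link_text = ""

    def handle_data(self, data):
        if self._link_depth:
            self._link_text += data

    def handle_endtag(self, tag):
        if tag == "a" and self._link_depth:
            self._link_depth -= 1
            if not self._link_text.strip():
                self.issues.append("empty link text")

scan = A11yScan()
scan.feed('<h2>Intro</h2><h4>Detail</h4><img src="x.png"><a href="/doc"></a>')
print(scan.issues)
# ['heading skip: h2 -> h4', 'img missing alt attribute', 'empty link text']
```

If a few dozen lines of naive parsing can catch these, a purpose-built checker certainly can, which is exactly why these failure categories are the ones AI assistance handles best.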
And from 1 January 2026, every existing Commonwealth digital service is required to meet WCAG 2.2 Level AA (new services were already required from 1 January 2025) (DTA Accessibility Guidance, 2026). The Digital Service Standard Criterion 9 requires it. The DDA 1992 requires accessible digital services as a baseline. If your agency has a significant content library in SharePoint and you're not using the tools I'm about to describe, you're doing remediation the hard way.
Tool 1: The SharePoint Pages Authoring Accessibility Assistant
This is the one most government teams don't know about. It's built directly into the modern SharePoint page editor, no add-on required, no extra licence, no configuration needed. Microsoft calls it the SharePoint Pages Authoring Accessibility Assistant (Microsoft Support, 2025).
What does it actually check? Heading structure and order (so if someone's jumped from an H2 straight to an H4, it'll flag it). Missing alt text on images. Link purpose (empty or ambiguous link text like "click here" or "read more"). Colour contrast on editable elements. Table structure. And it does this in real time, with suggested fixes, while the content author is editing the page.
That last part matters more than it might sound. The traditional accessibility workflow is: content author publishes, automated tool scans it later, finds the errors, tickets go back to the content team, fix, republish. That loop can take weeks. The Accessibility Assistant breaks that cycle by surfacing the issues before publication, when they're cheapest and easiest to fix.
I'll be honest about the limitations, though. It only covers native SharePoint web parts. If your team has built custom web parts or embedded third-party content (iframes, external widgets, that kind of thing), the Assistant won't touch those. You'll still need manual testing and developer-level review for anything non-standard.
For a team managing hundreds or thousands of SharePoint pages with standard content, though, this should be part of every publish workflow.
Tool 2: Automatic Alt Text for Images
When you upload an image to SharePoint Online, M365 AI generates a suggested alt text description automatically. You can accept it, edit it, or override it entirely in the image web part properties. The basic version is available in standard M365 without any Copilot add-on. If you've got Copilot licences or a Copilot+ PC, you get more detailed and accurate descriptions (Microsoft Support, 2025).
The quality reality: it's genuinely good for standard photography. People, objects, scenes, landscapes. For that kind of content, the auto-generated descriptions are often accurate enough to publish with a light review.
It's not good enough on its own for:
- Charts and infographics (it'll describe what's visually there but won't capture the data meaning)
- Images with text embedded in them (it might miss the text entirely, or transcribe it without context)
- Complex diagrams, maps, or technical illustrations
- Decorative images that should have empty alt text (the AI tends to want to describe everything)
For a government content library where you're dealing with policy documents, procedural guides, and public information pages, there'll be a lot of charts and infographics. Those need human-written alt text. But for the hundreds of images that are just photos of people, buildings, and events? Auto alt text handles those well and it means your content authors spend their review time on the cases that actually need judgement.
Tool 3: Copilot Plain Language Rewriting
This one does require the Copilot for Microsoft 365 add-on, which runs around AUD $45 per user per month on top of your M365 licence (verify current pricing at microsoft.com/en-au; government agreement rates may differ). I want to be upfront about that because the cost matters for government teams working with constrained budgets.
What you get: "Rewrite with Copilot" in the SharePoint rich text editor, which includes auto-rewrite, tone adjustment, length adjustment, and plain language simplification (Microsoft Support, 2025).
The plain language piece is directly relevant to DDA compliance and DTA guidance. Australian government agencies have obligations under the Digital Service Standard to make content readable and understandable by a broad public audience, and WCAG 2.2's Guideline 3.1 (Readable) addresses exactly this. A lot of government web content falls short here not because of technical errors, but because policy language gets copy-pasted directly from ministerial briefs into public-facing pages.
I've seen Copilot's rewriting genuinely improve readability on dense policy text. It works best when you give it clear direction ("simplify this to a Year 10 reading level" or "rewrite this for a general audience unfamiliar with government process"). The output still needs human review, though, and I'd say that firmly. An AI rewrite can lose critical nuance in technical or legal content, and in a government context, publishing something that's easier to read but technically imprecise can cause real problems. This is a drafting tool, not a publishing button.
If your agency is deciding whether the Copilot licence cost is justified, the accessibility use case is one genuine argument in its favour. But the ROI calculation only works if you've got content authors who'll actually use the feature as part of their workflow, and that requires training and process change alongside the licence purchase.
Tool 4: Microsoft Editor (No Extra Licence Needed)
This one's already in Word and Outlook if you're on M365 Business Premium, E3, or E5. No extra licence required.
Microsoft Editor flags passive voice, complex language, jargon, and gendered terms. It's not an accessibility tool in the WCAG sense, but it addresses the cognitive accessibility dimension that often gets overlooked in formal compliance programmes. WCAG 2.2's Success Criterion 3.1.5 (Reading Level), a Level AAA criterion but a useful target regardless, calls for content that doesn't require more than a lower secondary education reading level, or for a supplementary version to be provided. Editor helps writers catch patterns that push readability scores up: long sentences, abstract nouns, passive constructions.
For government teams who don't have Copilot licences and want to improve content readability without additional cost, Editor is the practical answer. It's not going to transform a 4,000-word policy brief into plain English on its own, but it surfaces the most common issues in real time as someone's writing.
Tool 5: SharePoint Premium Document Processing
SharePoint Premium (previously called Syntex) includes AI-powered document processing: classification, metadata extraction, content tagging, and OCR for image-based content. It's available on a pay-as-you-go model, which is useful for government teams that don't want to commit to per-user licensing for an occasional-use feature.
Where it helps with accessibility: if you've got a large library of uploaded PDFs that need to be classified and triaged for remediation, Premium's document processing can help identify document types, extract structural information, and surface content that's likely to have accessibility issues (scanned documents with no text layer, for example).
Where it doesn't help: it won't produce a fully tagged, screen-reader-accessible PDF. Full PDF remediation still requires Adobe Acrobat Pro or a dedicated PDF accessibility tool like PDF Accessibility Checker (PAC 2024). What Premium does is help you manage and prioritise the remediation backlog, which is genuinely valuable if you're dealing with thousands of legacy documents and need to work out which ones to tackle first.
The Government AI Blocker That Just Got Fixed
Here's the section I've been building to. Because there's a feature that became generally available at Microsoft Ignite in late 2025 that I think is genuinely significant for Australian government M365 adoption, and I haven't heard many people talking about it specifically in this context.
It's called DLP for Copilot Prompts, and it's in Microsoft Purview (Microsoft TechCommunity, Nov/Dec 2025).
The single biggest objection I hear from government IT and security teams when Copilot comes up is: "What happens when a user puts sensitive information in a prompt?" It's a legitimate concern. If someone asks Copilot "Can you summarise this PROTECTED health record and draft a response?" you've potentially got AI processing content that shouldn't be going anywhere near a general-purpose AI system.
Purview DLP for Copilot Prompts addresses this directly. If a user's prompt contains content that triggers your DLP policies, Copilot doesn't respond. The same policy framework that governs email and SharePoint document handling now extends to Copilot prompts across M365 Copilot and Copilot Chat.
For agencies that are already using Microsoft Purview with the Australian Government Information Security Manual (ISM) sensitivity labels (OFFICIAL, SENSITIVE, PROTECTED), those policies extend automatically to Copilot prompts. You don't need to build a separate AI governance layer. The one you've already invested in applies.
I'd reframe this for any agency that's been sitting on the fence about M365 Copilot for data-handling reasons. The question used to be "Is it safe to use AI with our data?" The question now is: "Are our Purview DLP policies correctly configured?" That's a different conversation, and it's one most security teams already know how to have.
For the plain language and accessibility use cases I described earlier, this means government content teams working with sensitive policy documents can use Copilot's rewriting and simplification features without the same risk profile that existed twelve months ago. That's significant.
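To be clear about what "gating" means here, conceptually and only conceptually: Purview's real DLP engine is policy-driven, configured in the compliance portal, and far richer than anything sketched below. The marking strings are made-up examples, not actual ISM syntax. But the shape of the behaviour is this:

```python
# Toy illustration of the prompt-gating concept ONLY. This is NOT how
# Purview DLP is implemented or configured; the marking strings below
# are hypothetical examples for illustration.
import re

PROTECTIVE_MARKINGS = re.compile(
    r"\b(PROTECTED|SEC=PROTECTED|SENSITIVE:PERSONAL)\b")  # assumed examples

def allow_prompt(prompt):
    """Return False when the prompt carries a protective marking."""
    return PROTECTIVE_MARKINGS.search(prompt) is None

print(allow_prompt("Summarise this PROTECTED briefing"))    # False
print(allow_prompt("Rewrite this page in plain language"))  # True
```

The real system evaluates prompts against the same policy definitions that already govern email and documents, which is precisely why no separate governance layer is needed.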
The Australian Compliance Picture
Let me bring the compliance obligations together, because I know different readers will be at different points in their WCAG journey.
Under the DDA 1992, it's unlawful for Commonwealth agencies and most state government bodies to provide digital services that discriminate against people with disability. That's been the baseline for years.
The DTA's Digital Service Standard, Criterion 9 (Make it accessible), requires that government services meet WCAG 2.2 Level AA (DTA Digital Service Standard, 2025). The updated timeline means new services were required to meet WCAG 2.2 from 1 January 2025, and all existing services from 1 January 2026. If you're reading this in late February 2026, that second deadline has passed.
WCAG 2.2 builds on 2.1 with additional success criteria, particularly around cognitive accessibility and mobile interaction. The full standard is published by the W3C (W3C, 2023).
Microsoft publishes VPAT accessibility conformance reports for SharePoint Online and other M365 products at Microsoft Accessibility, 2025. These are useful for procurement purposes and for demonstrating due diligence to auditors, though they document Microsoft's compliance, not yours. Your content still needs to meet the standard.
One more practical note: Australian data residency for M365 is handled through Microsoft's Azure data centres in Sydney and Melbourne. Specific Azure services in those regions appear on the ASD Certified Cloud Services List; agencies with PROTECTED-level data handling requirements should verify M365 Copilot's specific status against their own security risk framework before deployment. That's the first question to resolve before any of the tool-specific decisions.
What AI Still Can't Do (And Won't, Any Time Soon)
I want to be direct about this because I've seen too many teams get sold the idea that automated tools plus AI equals compliance. It doesn't.
Here's what you still need human judgement and proper testing for:
Keyboard navigation. The SharePoint Accessibility Assistant doesn't test whether your page can be used without a mouse. That requires automated tools like Deque Axe or manual keyboard testing, and then developer work to fix the underlying issues in templates or custom code.
Screen reader compatibility. Auto alt text tells you what's visually in an image. It doesn't tell you whether that description is meaningful to someone who's navigating the page with a screen reader for a specific purpose. A photo captioned "three people in a meeting room" might be perfectly accurate and entirely useless in context. Human review against actual user need is irreplaceable here.
Colour contrast in images. The Accessibility Assistant can flag contrast issues in editable text elements. It can't fix images that have low-contrast text baked in as part of the image file. Those need to be redesigned or replaced.
Custom web parts and third-party integrations. If your SharePoint environment has custom-developed components, embedded external widgets, or integrated line-of-business applications, the built-in tools won't test those. Accessibility compliance for custom development requires developer-level review and testing with assistive technology.
Complex data tables. AI-generated table markup needs validation. Screen reader users navigate tables using structural cues (header associations, scope attributes, caption elements) that require careful human construction, especially for multi-level or merged-cell tables.
ARIA landmark regions and skip navigation. This is template-level work. Your SharePoint master page and page layouts need to include proper landmark regions and skip navigation links. That's a development task, not a content task, and no content-level AI tool touches it.
Cognitive accessibility beyond reading level. WCAG 2.2 includes success criteria around timing, predictability, and consistent navigation. These require design and UX judgement, not just text analysis.
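On the data-table point above, here is roughly the structure screen readers rely on, sketched as a builder for the simple single-header-row case. The hard cases (multi-level or merged-cell tables) need headers/id associations that this toy deliberately doesn't attempt, which is exactly where human construction comes in:

```python
# Sketch of accessible table markup for the SIMPLE case: one header row,
# one row-header column, caption, and scope attributes. Multi-level and
# merged-cell tables need headers/id associations not attempted here.
from html import escape

def simple_table(caption, headers, rows):
    parts = [f"<table><caption>{escape(caption)}</caption><thead><tr>"]
    parts += [f'<th scope="col">{escape(h)}</th>' for h in headers]
    parts.append("</tr></thead><tbody>")
    for row in rows:
        # First cell of each row acts as a row header for screen readers.
        cells = "".join(f"<td>{escape(c)}</td>" for c in row[1:])
        parts.append(f'<tr><th scope="row">{escape(row[0])}</th>{cells}</tr>')
    parts.append("</tbody></table>")
    return "".join(parts)

markup = simple_table("Service volumes", ["Service", "2024", "2025"],
                      [["Passport renewals", "1.2m", "1.4m"]])
print('scope="col"' in markup and "<caption>" in markup)  # True
```

Generating the easy case is trivial; validating that a complex AI-generated table actually carries these structural cues is the part that still needs a human.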
I say all of this not to diminish the tools I've described, but to set realistic expectations. The M365 AI features are genuinely useful for catching the most common, most detectable, most preventable failures. They're not a substitute for a proper accessibility programme.
A Practical Workflow for Government Content Teams
So how do you actually integrate these tools into ongoing content governance rather than treating them as a one-time audit fix?
Start with your most-visited pages. Run the SharePoint Pages Authoring Accessibility Assistant on your top 20 most-visited pages today. You don't need a project plan for this. Open the pages in edit mode; the Assistant is already there. Fix what it surfaces. You'll probably find a cluster of heading structure issues and missing alt text that you can clear in a day or two.
Enable auto alt text for all new uploads. This should be the default behaviour anyway, but confirm it's turned on in your SharePoint environment. For new content, require content authors to review and confirm the suggested alt text before publishing. Build this into your publishing checklist, not as an optional step.
Set a quarterly accessibility review cadence. Every page published in the previous quarter gets run through the Accessibility Assistant before the end of the following quarter. This isn't a comprehensive audit, but it means problems don't age for years before someone looks at them.
Use Copilot plain language rewrites for public-facing policy documents if you have the licence. Route the output through a human reviewer (a comms team member or experienced content strategist) before publishing. The reviewer's job is to check technical accuracy and policy alignment, not to proof-read grammar. The AI handles the readability lift; the human catches the nuance failures.
For legacy PDFs, triage with SharePoint Premium first. If you've got hundreds or thousands of uploaded PDFs, use document processing to classify and prioritise them. Focus professional remediation effort on the high-traffic, high-risk documents. Scanned image PDFs with no text layer are top priority. Word-originated PDFs with clean structure are lower priority.
For Copilot adoption decisions, check your Purview DLP configuration first. If your agency has M365 E5 or Purview add-on licensing and has implemented information barriers and sensitivity labels, the DLP for Copilot Prompts protection is available to you. Verify the policies are in place and tested before you roll Copilot out to content teams working with sensitive material. Once that's confirmed, the prompt protection is automatic and you can have a more confident conversation with your security team about broader adoption.
The Bottom Line
Government digital teams are under-resourced and under continuous compliance pressure. I've seen that first-hand across enough agencies to know it's not a specific organisational failing; it's the structural reality of public sector digital work.
What I'm describing here isn't a magic solution. It's a set of tools that are already in your existing Microsoft 365 investment, that your content authors can use without specialised training, that address the most common and most preventable categories of WCAG failure, and that now include data governance protections that remove one of the main objections to AI adoption in government environments.
That's not nothing. And given that the WCAG 2.2 transition date for existing Commonwealth services has now passed, it's worth knowing about before your next audit call.
I'm still figuring out where these tools fit in longer-term accessibility programme design. The honest answer is that the AI capability is moving faster than the governance frameworks around it, which is true in a lot of areas right now. But for the specific problem of reducing your most common, most detectable content-level accessibility failures before an auditor finds them? The tools are there. You just have to know where to look.
Sources
- Microsoft. "Use the SharePoint pages authoring accessibility assistant." Microsoft Support, 2025. https://support.microsoft.com/en-us/office/use-the-sharepoint-pages-authoring-accessibility-assistant-301eb800-594a-401b-bd4a-792e7a2a1b5a
- Microsoft. "Accessibility features in SharePoint Online." Microsoft Support, 2025. https://support.microsoft.com/en-us/office/accessibility-features-in-sharepoint-online-f291404a-dc7e-44de-a31f-d81b3099c2b9
- Microsoft. "Add accessible pictures and media to a SharePoint Online site." Microsoft Support, 2025. https://support.microsoft.com/en-us/office/add-accessible-pictures-and-media-to-a-sharepoint-online-site-a17dc07d-1184-49d1-adc4-87841644cda4
- Microsoft. "Write with Copilot in SharePoint rich text editor." Microsoft Support, 2025. https://support.microsoft.com/en-us/topic/write-with-copilot-in-sharepoint-rich-text-editor-2e32d386-5b0b-4f7e-9a69-8ae96a294647
- Microsoft TechCommunity. "What's New in Microsoft 365 Copilot: November/December 2025." Microsoft 365 Copilot Blog, 2025. https://techcommunity.microsoft.com/blog/microsoft365copilotblog/what%E2%80%99s-new-in-microsoft-365-copilot--november--december-2025/4469738
- Digital Transformation Agency. "Make it accessible: Digital Service Standard Criterion 9." DTA, 2025. https://www.dta.gov.au/help-and-advice/digital-service-standard/digital-service-standard-criteria/9-make-it-accessible
- Digital Transformation Agency. "Accessibility guidance (WCAG 2.2)." DTA, 2026. https://www.dta.gov.au/accessibility
- W3C. "Web Content Accessibility Guidelines (WCAG) 2.2." W3C Recommendation, 2023. https://www.w3.org/TR/WCAG22/
- WebAIM. "The WebAIM Million 2025 Report." WebAIM, 2025. https://webaim.org/projects/million/
- Microsoft. "Accessibility conformance reports (VPATs)." Microsoft Accessibility, 2025. https://www.microsoft.com/en-us/accessibility/conformance-reports
