Navigating the Future of Tech Governance: Takeaways from the AICD Tech Governance Forum 2026 and the Urgent Need for Digital Sustainability
Published on Sustainably Digital | May 16, 2026
On May 12, I had the opportunity to attend the Australian Institute of Company Directors (AICD) Tech Governance Forum 2026. As the pace of technological change continues to accelerate, the forum provided a crucial space for directors, subject-matter experts, and business leaders to unpack the profound governance challenges surrounding artificial intelligence, cybersecurity, and digital transformation.
The overarching theme of the day was clear: governing modern technology is no longer just an IT issue; it is the core of modern business strategy and risk management. Here are my main takeaways from the event, followed by some critical reflections on how these themes intersect with the core mission of Sustainably Digital.
Key Takeaways from the Forum
Technology Fluency is Now a Baseline Requirement
The era of delegating technology oversight to a single "tech expert" on the board is over. As highlighted by experts like Lee Hickin and Christine Holman, AI governance failures, such as high-profile chatbot incidents or data breaches, are rarely technology failures; they are structural governance failures. Boards must establish a minimum floor of technology fluency across all directors to meaningfully interrogate decisions and ensure capital allocation discipline.
AI is a People and Change Management Problem
A recurring sentiment throughout the forum was that technology doesn't innovate or disrupt on its own—people do. Sarah Carney noted that AI adoption fundamentally disrupts traditional industrial-era organisational structures, meaning that successful AI implementation is 80% about people, culture, and change management, and only 20% about the technology itself. Steve Vamos provocatively suggested that in this era, a CEO's title should essentially be "Change Executive Officer," as driving organisational alignment is their most non-delegable duty.
Cyber Risk is Accelerating and Interconnected
The speed of cyberattacks has drastically compressed; what used to allow hours for a response now demands action in seconds. The centralisation of business functions into Software-as-a-Service (SaaS) platforms has inadvertently created massive "honey pots" for attackers, shifting the nature of third-party risk. To combat this, boards were advised to commission comprehensive failure impact analyses of critical systems and to actively participate in incident response exercises with key vendors.
Responsible AI Goes Beyond "Ethics"
To avoid AI becoming an orphaned issue that no one wants to own, governance must shift from vague "ethics" to actionable "responsible AI" frameworks. Microsoft's practical test for AI projects involves asking three simple questions: Could this impact fundamental human rights? Could it affect psychological or physical well-being? Could it impact life opportunities? Without embedding these guardrails directly into decision-making systems, documented principles are meaningless.
Conclusion: The Missing Link – Governing for Digital Sustainability
While the forum covered essential ground on strategy, security, and human impact, the most pressing takeaway for our Sustainably Digital community is the hidden environmental footprint of our digital acceleration.
As organisations rush to deploy resource-intensive technologies, environmental governance is at risk of being completely overshadowed by the hype. Consider these staggering environmental realities of our AI boom discussed at the forum:
Massive Energy and Carbon Costs: Training a single frontier AI model like GPT-3 generated an estimated 552 tonnes of CO2. Furthermore, the IEA projects that data centre electricity consumption could exceed 1,000 TWh annually by the end of 2026—roughly equivalent to the entire power use of Japan.
The Water Toll: Global AI freshwater use is projected to hit between 1.1 and 1.7 trillion gallons by 2027, with a tool like ChatGPT using roughly 2 litres of water for every 10–50 queries.
E-Waste Accumulation: Generative AI hardware churn is expected to add up to 5 million metric tonnes of e-waste by 2030.
Despite these impacts, a recent UNESCO/Thomson Reuters Foundation study found that 89% of companies fail to assess the environmental impact of their AI systems at all.
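To make the per-query figures above concrete, here is a minimal back-of-envelope sketch. All constants come straight from the estimates quoted above (2 litres per 10–50 queries); they are published approximations, not measured values, and real footprints will vary by model, region, and data centre.

```python
# Back-of-envelope water footprint from the figures cited above.
# 2 litres per 10-50 queries implies 0.04-0.20 litres per query.
WATER_L_PER_QUERY_LOW = 2 / 50    # optimistic end: 0.04 L/query
WATER_L_PER_QUERY_HIGH = 2 / 10   # pessimistic end: 0.20 L/query

def water_use_litres(queries: int) -> tuple[float, float]:
    """Estimated freshwater range (low, high) in litres for a given query volume."""
    return (queries * WATER_L_PER_QUERY_LOW, queries * WATER_L_PER_QUERY_HIGH)

# Example: an organisation running 1 million chatbot queries a month.
low, high = water_use_litres(1_000_000)
print(f"Monthly water footprint: {low:,.0f} to {high:,.0f} litres")
```

Even at the optimistic end of the range, this is the kind of untracked consumption that never appears in a board pack today.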
What Does This Mean for Boards and Sustainability?
Addressing "Shadow AI" as an Environmental Risk: Uncontrolled "Shadow AI"—where employees use unauthorised AI tools—is not just a data security risk; it represents a significant, untracked carbon and energy footprint that remains invisible to the board.
Scope 3 Emissions and Compliance: Under mandatory climate reporting obligations like AASB S2, cloud-hosted AI workloads must be recognised as a substantial source of Scope 3 emissions. If a board cannot get emissions data from its cloud or AI vendors, that is a glaring supplier governance failure.
Beware of "AI-Washing": Just as the ACCC has cracked down on greenwashing, companies that boast about net-zero targets while simultaneously expanding undisclosed, carbon-intensive AI infrastructure face massive regulatory and reputational risks.
Ultimately, true technology resilience must include physical and climate resilience. As we look to the future, boards must demand transparency from their vendors and begin measuring digital transformation through a new lens—perhaps adopting a Sustainable AI Quotient that tracks business output per unit of energy, water, and carbon.
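A "Sustainable AI Quotient" is not, to my knowledge, a standardised metric, so the shape below is purely illustrative: one hypothetical way a board could operationalise "business output per unit of energy, water, and carbon" is as a weighted ratio, where the weights let directors tune which resource they scrutinise most.

```python
from dataclasses import dataclass

@dataclass
class AIWorkloadFootprint:
    """Resource inputs for an AI workload over a reporting period (hypothetical schema)."""
    energy_kwh: float
    water_litres: float
    carbon_kg: float

def sustainable_ai_quotient(business_output: float,
                            footprint: AIWorkloadFootprint,
                            weights: tuple[float, float, float] = (1.0, 1.0, 1.0)) -> float:
    """Business output (e.g. revenue, cases resolved) per weighted unit of
    energy, water, and carbon consumed. Higher is better."""
    w_energy, w_water, w_carbon = weights
    denominator = (w_energy * footprint.energy_kwh
                   + w_water * footprint.water_litres
                   + w_carbon * footprint.carbon_kg)
    return business_output / denominator

# Example: 10,000 units of output against one quarter's reported footprint.
fp = AIWorkloadFootprint(energy_kwh=50_000, water_litres=20_000, carbon_kg=30_000)
print(f"SAIQ: {sustainable_ai_quotient(10_000, fp):.4f}")
```

Whatever the exact formula, the governance value lies in tracking the quotient over time and across vendors, which forces the emissions-data transparency discussed above.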
Digital innovation is not a free pass on our planetary boundaries. If we are to capture the immense opportunities of AI and emerging tech, we must ensure our digital strategies are as sustainable as they are smart.
Links & Resources:
Governing in the age of AI [Lee Hickin, Australian National AI Centre]
Responsible AI in practice: Our first AI Company Data Initiative report [UNESCO and the Thomson Reuters Foundation]
AI Energy & Sustainability Calculator [GreenPixie]
Conference Notes:
No 1 - Lessons from the Chair Governing Modern Tech Businesses.pdf
No 2 - Governing in the Age of AI What Directors Need to Know.pdf
No 3 - Digital Transformation Delivering Measurable Impact.pdf
No 4 - Responsible Technology Ethics Trust and Oversight in the Boardroom.pdf
No 5 - Leading Through Technology Change What Boards Need to Ask.pdf
No 6 - Cyber Tech Resilience & More Governing Risk in an Interconnected World.pdf