The New Zealand government takes a reactive approach to data governance and security
Simon Walker, Chief Strategy & Revenue Officer
It’s no secret that New Zealand is a laggard when it comes to regulation that sets standards for organisations’ data governance and cyber-security.
The holiday period has been dominated by news of major data breaches, with the Manage My Health breach exposing the health records of 120,000 New Zealanders.
New Zealand’s laissez-faire approach to regulation will be forced to change in reaction to these and other incidents. We’ll look to regulation in other developed territories for inspiration – expect elements of the EU’s Data Act, Australia’s Cyber Security Act and others to inform our approach.
This regulatory change will drive boards to take data governance seriously – when there are real penalties, change happens fast. The threat of business interruption alone has often not been enough to inspire action.
Expect this to force the technology organisation’s hand – investment will now be mandated by boards.
Breaking free from CUDA: NVIDIA’s lock-in ends
Ian Hayton, Enterprise Architect
AI solutions have historically been developed around a single dominant hardware and compiler ecosystem: NVIDIA and CUDA. This has created strong vendor lock-in and resulted in elevated pricing for both software tools and hardware architectures.
In contrast, the CPU world achieved broad portability decades ago through open compilers such as GCC and LLVM. This same revolution has yet to fully arrive for GPUs and AI accelerators.
Competitive pressure and open-source innovation are eroding the NVIDIA/CUDA moat:
- Maturing open-source tools:
  - OpenAI’s Triton (now with robust backends for non-NVIDIA hardware like Google TPUs, AWS Trainium and AMD MI series) – see the sketch after this list
  - PyTorch’s torch.compile with OpenXLA for hardware-agnostic optimisation
  - Projects like Spectral Compute’s “SCALE” (demonstrating high CUDA API coverage for alternative hardware)
- Initiatives like the Unified Acceleration Foundation (UXL) and Huawei’s open-sourcing of its CANN toolkit for Ascend AI GPUs aim to create viable open alternatives.
- ZLUDA, an open-source project for running unmodified CUDA binaries on non-NVIDIA GPUs (primarily AMD ROCm, with multi-vendor ambitions). It includes offline compilation, improved ML inference support (e.g. llm.c milestones) and expanded development resources, signalling tangible progress toward drop-in compatibility
- Major players like Google and Meta are advancing efforts (e.g. “TorchTPU” initiatives) to make PyTorch run natively on TPUs and other accelerators, reducing reliance on CUDA for training and inference
- Hyperscalers continue developing custom silicon (TPUs, Trainium, Maia), with open-source compilers helping bridge the software gap and enabling more cost-competitive inference
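To make the portability point concrete, here’s a minimal sketch of the kind of kernel Triton enables – plain Python that Triton lowers to whichever backend is installed, with no CUDA C++ involved. The kernel and helper names are our own illustration, not taken from any of the projects above:

```python
import torch
import triton
import triton.language as tl

# A vendor-neutral elementwise add. Triton compiles this at runtime
# for the installed backend (NVIDIA, AMD ROCm) – no CUDA C++ required.
@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    pid = tl.program_id(axis=0)                      # this block's index
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements                      # guard the ragged tail
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    out = torch.empty_like(x)
    n = out.numel()
    grid = (triton.cdiv(n, 1024),)                   # 1-D launch grid
    add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
    return out
```

The same source runs unchanged on AMD hardware under PyTorch’s ROCm builds – the kind of portability the CPU world has taken for granted since GCC.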
2026 will be the year this walled garden begins to break down in earnest, enabling more cost-effective, multi-vendor solutions and triggering significant market realignment.
Geopolitics finally forces businesses to consider data sovereignty
Simon Walker, Chief Strategy & Revenue Officer
Ernst Du Toit, Technology Lead
In 2026, the world is less stable than it has been at any point in the post-WW2 era. Existing conflicts such as the war in Ukraine have been joined by instability in Iran, the prospect of conflict over semiconductors and sovereignty in Taiwan, and American interventions in Venezuela and (potentially) Greenland.
Trust in the global players is at all-time lows, and when geopolitics creates risk it drives business action.
Paying perpetual rent just to keep your core business operating is a tough enough position. When this is coupled with availability risk driven by geopolitical uncertainty, it creates an untenable situation.
The EU is already seeing a major repatriation of workloads and data from the US and US companies. Expect the same in 2026 in APAC as regional tensions ramp up.
Compute costs spiral out of control as rising memory prices bite
Carl Black, Sourcing Manager
The technology industry is experiencing a sharp upward trend in memory pricing across all categories – DDR4, DDR5, NAND flash and HBM – driven by an unprecedented collision of AI-boom demand and extreme supply-chain constraints. This pricing surge reflects a fundamental market shift: memory manufacturers are prioritising high-margin datacentre and AI applications while simultaneously facing dual demand from both legacy DDR4 infrastructure and new DDR5 deployments.
The trend shows no signs of slowing: suppliers have signalled that their 2026 inventory is already spoken for and that they intend to maintain pricing stability through production management rather than allowing prices to fall. This creates a sustained cost-escalation environment for hardware procurement over the next 12-18 months. It’s best to budget for higher memory costs now.
Strategic procurement planning isn’t optional anymore; it’s essential for controlling costs. If you’re planning infrastructure upgrades or new deployments, factor in these elevated costs from the start rather than being caught off guard. Read more about it here.
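To see what that means for a budget, here’s a back-of-envelope sketch – every figure in it is an illustrative assumption, not market data:

```python
# Back-of-envelope memory budget impact for a 2026 server refresh.
# All figures below are illustrative assumptions, not market data.
servers = 40                      # planned refresh count (assumed)
memory_per_server_gb = 1024       # assumed configuration
price_per_gb_today = 4.00         # USD baseline (assumed)
memory_inflation = 0.30           # assumed 30% rise over the buying window

baseline = servers * memory_per_server_gb * price_per_gb_today
inflated = baseline * (1 + memory_inflation)

print(f"Baseline memory spend:  ${baseline:,.0f}")
print(f"With inflation applied: ${inflated:,.0f}")
print(f"Budget gap to absorb:   ${inflated - baseline:,.0f}")
```

On those assumptions the gap is roughly $49,000 on a $164,000 line item – the kind of variance that derails a fixed budget if it isn’t planned for.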
William Jevons was right: AI drives employment growth
Simon Walker, Chief Strategy & Revenue Officer
In 1865, William Stanley Jevons observed that the more efficient Watt steam engine caused an increase in coal consumption, as people discovered new tasks that had become economic to perform. This effect – increased efficiency actually increasing consumption – has since been known as the Jevons Paradox, and we’re going to begin to see it with AI in 2026.
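One stylised way to see when the paradox bites: assume demand for a service has constant price elasticity ε (our simplifying assumption, not Jevons’ own formulation), and that an efficiency gain e cuts the effective price from p to p/e. Total resource use R then scales as

$$
R(e) = \frac{D(p/e)}{e}, \qquad D(q) = A\,q^{-\varepsilon} \;\Rightarrow\; R(e) = A\,p^{-\varepsilon}\,e^{\,\varepsilon - 1},
$$

which rises with efficiency whenever ε > 1 – that is, whenever cheaper output unlocks proportionally more demand, as it did for coal.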
We have evidence of the Jevons Paradox in labour markets as well. In 1900, 40% of Americans were employed in agriculture – today the figure is closer to 2%. Yet as food production became more efficient, the labour it freed up made possible employment that a huge agricultural workforce had previously ruled out. US unemployment is 4% today – down from 5% in 1900.
Increased labour productivity with AI will actually increase demand for labour. Current jobs will become less relevant, of course – but overall, we’ll see an employment boom. For example, developers (ostensibly under threat as LLMs produce better and better code) will be able to do more faster – increasing demand as jobs that previously couldn’t justify the expense of custom development now become affordable.
Check out this great article exploring the paradox in more detail, along with how it might apply to AI.
New Zealand doesn’t address the most important commodity of the next 50 years: electricity
Simon Walker, Chief Strategy & Revenue Officer
The Electricity Authority’s generation investment pipeline looks healthy at first glance: it tracks 289 potential generation projects with a combined 44.29 gigawatts of capacity. But there are two problems with this picture:
- It looks great based on our current power demand – but it doesn’t consider the potential future demand from high-performance compute
- The bulk of the capacity is wind and solar – these provide intermittent power (when the wind is blowing or the sun is shining), but data centres need to operate 24/7
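A rough sketch of why nameplate capacity overstates what a 24/7 load can actually use – the intermittent share and capacity factor below are illustrative assumptions, not Authority figures:

```python
# Rough firm-capacity arithmetic for the tracked pipeline.
# The share and capacity factor are illustrative assumptions only.
pipeline_gw = 44.29               # tracked pipeline, nameplate
intermittent_share = 0.80         # assumed wind/solar share
capacity_factor = 0.30            # assumed blended wind/solar factor

intermittent_gw = pipeline_gw * intermittent_share
firm_gw = pipeline_gw - intermittent_gw              # assumed firm sources
firm_equivalent_gw = firm_gw + intermittent_gw * capacity_factor

print(f"Nameplate pipeline:    {pipeline_gw:.1f} GW")
print(f"Rough firm equivalent: {firm_equivalent_gw:.1f} GW")
```

On those assumptions, 44 GW of pipeline delivers something under 20 GW of round-the-clock supply – before any new high-performance compute demand is counted.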
Globally, announcements of new data centres have increased by 240%, but the number actually built is a fraction of that pipeline. A lack of electricity generation capacity is holding everything up.
New generation is slow to build and capital intensive. It’s going to take 10 to 15 years to bring capacity online – so we need to start today. Regardless of what source we use, we will need significantly more electricity in 2040 than we have planned for.
This is a safe prediction – our struggles with big projects over the last 25 years make it highly likely that this remains unaddressed in 2026. It should be much higher on any government’s agenda than it actually is.
If you're re-thinking your 2026 plans...
Don't let 2026's technology shifts catch you off guard. If you're facing escalating memory costs, evaluating AI investments, or navigating data sovereignty requirements, our team of expert advisors can help you build a strategy that's resilient, cost-effective, and aligned with your needs.