Published: 21 April 2026
The Minister of Justice has asked the Law Commission to undertake a review of legal issues related to the use of automated decision-making by government.
‘Automated decision-making’ refers to the use of digital technologies, including artificial intelligence, to assist decision-making processes. At present, New Zealand does not have overarching law, standards or guidance specifically addressing how government agencies should use automated decision-making in a legally compliant and consistent manner.
Tumu Whakarae | President Dr Mark Hickford says the absence of a clear framework presents challenges:
“Currently there can be a siloed approach among government agencies to identifying and mitigating legal risks when automated decision-making is proposed.
“This can result in a range of issues, including inefficiencies in implementing automated decision-making, inconsistencies between departments, risks of poor practice and Crown liability, and unnecessary risk aversion that can discourage responsible innovation.”
Dr Hickford says the Commission’s work will focus on creating a coherent legal framework to guide government agencies.
“A clear and consistent framework for identifying and mitigating legal risk will give agencies greater confidence to invest in and use automated decision-making tools, particularly those that rely on artificial intelligence, while ensuring decisions remain lawful, transparent and fair.”
The Commission is expected to begin this project around the middle of this year.