
New report delivers expert governance advice for Australian businesses in response to Voluntary AI Safety Standard


11/9/2024

Australian businesses are being urged to reduce AI system biases and have a better understanding of copyright law in response to the Federal Government’s recently released Voluntary AI Safety Standard. 

The new white paper from the Governance Institute of Australia and the National AI Centre aligns the 10 voluntary guardrails promoting safe, responsible and transparent AI with best-practice governance guidance from industry and experts.

Sponsored by Clayton Utz, Diligent and PKF, the White Paper on AI Governance found that Australian businesses are not taking full advantage of AI, with confidence and adoption rates lagging behind those of comparable countries.

Based on three specialist roundtable consultations, the report also found ethical challenges are leading to missed opportunities for innovation and efficiency.   

Chief Executive of the Governance Institute of Australia, Megan Motto FGIA FCG, said the report provides an extensive overview of how businesses of all sizes and from all sectors can harness the technology to drive growth, enhance efficiency and create value.  

“Understanding and adopting AI is essential for staying competitive in today’s market,” Ms Motto said.  

“Through this report, business leaders will gain valuable insights and practical strategies to harness the power of AI, ensuring their organisations remain at the forefront of innovation.”  

Stela Solar, Director of the National Artificial Intelligence Centre (NAIC), said working closely with industry partners is crucial to alert senior business leaders to the influence they wield in adopting safe and responsible AI practices within their organisations.  

“Directors’ responsibilities have always been shaped and influenced by technology,” Ms Solar said.   

“However, the speed and scale of AI development and deployment requires them to lean in, to learn and lead their organisation towards the potential of AI, whilst increasing efforts to mitigate new and emerging risks.”

Key expert tips include:  

  • Continuously improve AI systems to reduce biases and ensure fair outcomes for all users 
  • Seek advice from regulators on copyright and deepfakes 
  • Include legal, ethical, and commercial considerations for AI in the design of AI procurement and contracts 
  • Leverage AI for better digital asset security 
  • Prioritise robust privacy and backup systems during AI implementation 

Ms Motto said the report examines both the opportunities and challenges of working with AI, advocating for a balanced approach with sound ethical principles in mind.  

“Good governance is essential to ensure that AI technologies are developed and used in ways that are transparent, fair, and aligned with societal values,” she said.  

“Our partnership with NAIC underscores our commitment to not only advancing AI but doing so in a manner that upholds the highest standards of integrity and accountability.”  

For more information or to request interviews, contact media@governanceinstitute.com.au 

Supporting partner: National AI Centre (NAIC)

Proudly sponsored by: Clayton Utz, Diligent and PKF

About the Governance Institute of Australia  

A national membership association, the Governance Institute of Australia advocates for a community of governance and risk management professionals, equipping more than 8,000 members with the tools to drive better governance within their organisations.
