Australians for AI Safety

Does Senator David Pocock support AI safety policies?

Senator David Pocock (Independent) represents ACT in the Senate.

Senator Pocock is a signatory to the Statement on Superintelligence, which calls for a prohibition on the development of superintelligence that is not lifted before there is broad scientific consensus that it will be done safely and controllably, and strong public buy-in. He is also a signatory to the Australians for AI Safety Open Letter calling for the next government to establish an AI Safety Institute and an AI Act with mandatory guardrails on high-risk AI.

He has explicitly called for the introduction of legislation imposing mandatory guardrails for the safe and responsible use of AI. He has also called for the Government to fund a national AI safety centre, expressing his disappointment that this was not a key recommendation of the Senate inquiry into AI adoption despite being raised by expert witnesses.

More broadly, he has said that AI has “extraordinary potential for uses that could enhance the wellbeing of humanity” but also “poses significant risks to life as we know it and to human civilisation as we know it.”

Key Statements on AI Safety

“With the growing uptake of AI, legislation to mandate how AI is used in high-risk settings needs to be an urgent priority for government.”
Parliament of Australia · View source →

“Senator David Pocock expressed his concern that Science and Industry Minister Ed Husic was creating temporary expert advisory bodies but hasn't taken steps to create an enduring AI Safety Institute. After hearing evidence about the funding Canada and the UK provide to their AI Safety Institutes, Pocock said, 'that seems very doable to me'.”
Australian Security Magazine · View source →

“Deepfakes used without consent threaten our democracy and should be banned in the context of elections. Australians deserve to know that the information they receive from parties and politicians is genuine.”
Information Age · View source →

“Now is the time for the government to act to safeguard our democracy and ensure elections are fought and won as a contest of ideas, not on the basis of who can produce the best deepfakes or tell the most convincing lies.”
ABC News · View source →

“Deepfakes [and] generative AI pose a real risk to democracy and we need the government to ban the use of this technology when it comes to elections.”
The Guardian · View source →

Contact Senator Pocock

Official contact page →

How do the politicians score on AI safety?

Search to see where parties and candidates stand on key AI policy issues

Good Ancestors

Australians for AI Safety is organised by Good Ancestors (ABN 23 664 195 484), a charity registered with the Australian Charities and Not-for-profits Commission.

NOTE: By signing our open letters, individuals and organisations endorse only the core text. Footnotes and additional content are prepared by Good Ancestors and may not represent the views of all signatories.

© 2025 Good Ancestors. All rights reserved.

Website content that is electoral matter under the Commonwealth Electoral Act 1918 is authorised by Greg Sadler, Good Ancestors Policy, Melbourne.