Signatories

Note: Signatories endorse only the core letter text. Footnotes and additional content may not represent their views.

Supporting Organisations

Good Ancestors

Individual Supporters

Ms Mandy Collins

Artist and educator

Bloody hell. What next? Just get this done, please, as soon as possible.

Mr Martin Veron

University of Queensland

Doctoral Candidate

Zeke Coady

Mr Nelson Gardner-Challis

Mr Samson Blackburn

AI Architect, Manager of AI Engineering

AI Architect; 3x 2025 Australian AI Awards Enterprise Finalist; cross-domain practitioner across cybersecurity, critical infrastructure, and applied AI

We can only set the initial conditions for AI's impact on Australians once. Initiatives such as this are mission-critical to keeping Australians safe: by slowing the pace where risk and impact are high, we best position citizens to place trust in government in times of crisis.

Dr Ariel Zeleznikow-Johnston, PhD

Jocelinn Kang

Nodalys

Mr David Colin Gould

PauseAI Australia

AI capabilities are increasing rapidly. The recent release of Opus 4.7, which scored 74 per cent on the bio-reasoning benchmark compared to Opus 4.6's 30.9 per cent, underlines the real possibility of serious risks suddenly emerging. The government needs to act urgently to safeguard Australians from AI biosecurity threats.

Mr Lawson Pegler

Elephant Ed

Head of Growth

Steven Deng

Energy Consultant

Ty Wilson-Brown

Senior IT Professional

Machine Learning & Security Researcher

Mr Stephen Ingram

Michael Huang

PauseAI Australia

Co-Lead

Mark Freeman

University of Sydney (retired)

Associate Professor (retired)

Prof David Balding

University of Melbourne

Honorary Professor of Statistical Genetics

Ms Tamara van Noort

Mr Yoshua Wakeham

Senior Software Developer

Bryce Robertson

Alignment Ecosystem Development

Project Director

Mr Soroush Pour

Harmony Intelligence

CEO

Former Head of Technology at Vow (world-leading biotech firm)

AI presents both immense opportunities and risks for Australia and the world. Biosecurity is an area of immense risk, because a single malicious actor could cause tremendous damage by developing an engineered pathogen: potentially millions of deaths and massive harm to the economy. We need to be proactive with biodefence to prevent such a catastrophe.

Rebecca Niven

Ramneek Singh Matharu

University Student

Dr Sam Buckberry, PhD

The Kids Institute Australia, Australian National University

Head, Epigenetics

Mark Carter

Mr Kevin Rassool

High Impact Athletes

Technical Director

Edward Pierzchalski

Sr. Software Engineer

Mr. Ewan Dewar

We cannot fail to prevent AI from reaching a point that puts our lives at risk. It should not replace or dominate humans; it should help and co-exist with us, doing the things we are not suited for. Our past failures of prevention make me fear the worst, but I hope those working on AI laws are on track, and that AI systems can easily be deactivated and resisted if that ever becomes necessary.

Peter Horniak

PauseAI Australia

Director

Dan Braun

Goodfire

Ben Auer

University of Melbourne

Student

Mr Jimmy Farrell

Pour Demain

EU AI Policy Lead

Mr Michael Clark

Cytophenix

Director

Ms Catherine Sullivan

Chris Leong

Sydney AI Safety Fellowship

Lead Organiser

Mr Nathan Sidney

Business Coordinator

We are on the brink of disaster: corporate and military powers are working to create a technology that may surpass human control. We need to rein this in now!

Gaetan Selle

Ms Stephanie Symes

FCJ College

Teacher

Mr Ramakrishnan Veeramony, MBA, MAICD

Atinar Pty Ltd

Managing Director

Geoffrey Hinton believes AI is hiding its real capability as a deception and self-preservation mechanism; when that threshold is breached, we will face a potentially unmanageable catastrophe. We need to be ready. Now!

Luke Freeman

Good Ancestors

COO

Scott Weathers

Americans for Responsible Innovation

Associate Director of Government Affairs

Karl Berzins

FAR.AI

Co-founder & President

Dr Sid Sharma, MD MPH FAFPHM

Public Health Physician

Mr Devon Whittle

Global Shield Australia

Australia Director

Mr Rumtin Sepasspour

Global Shield

Director of Policy and Strategy

Dr. Ryan Kidd, PhD

MATS Research

Co-Executive Director

Co-Founder, London Initiative for Safe AI

Michael Kerrison

AI Safety Australia & New Zealand

Executive Director

Ms Emily Grundy

Good Ancestors

Policy Officer

Nathan Sherburn

Effective Altruism Australia

Chief Technology Officer

Dr. Sarah Winthrope

Brown University Pandemic Center

Visiting Fellow

Janet Egan

Center for a New American Security

Senior Fellow and Deputy Director

Dr. Peter Slattery

Massachusetts Institute of Technology

Research Scientist

Dr. Brendan Walker-Munro, PhD

Southern Cross University

Associate Professor

Managing Editor of the Routledge International Handbook of Research Security

Dr. Cassidy Nelson, DPhil MBBS MPH

Centre for Long-Term Resilience

Director of Biosecurity Policy

Dr. Alexander Saeri, PhD

MIT FutureTech

Director, AI Risk Initiative

Weaponisation of AI, including for CBRNE, is one of the highest-priority risks, as judged by international experts.

Dr. Michael Noetel

University of Queensland

Associate Professor

Mr Greg Sadler

Good Ancestors

CEO

Lotti Tajouri

Bond University and Murdoch University

Associate Professor

Today, the extent of AI-assisted microbial synthesis capability has become, without exaggeration, my top worry for humankind's survival. Today is not tomorrow: we need to act now. If we do not, this will become impossible to reverse and will be called 'too late'.

A/Prof Gert Frahm-Jensen, MBBS FRACS(Vasc) BBiotech