Putting Global Catastrophic Risk on the Australian Intelligence Community’s Radar
The Strategist
Details
- Date Published
- 18 Oct 2023
- Priority Score
- 5
- Australian
- Yes
- Created
- 22 June 2025, 07:12 pm
Description
The 2024 independent review of Australia’s national intelligence community has kicked off. It will focus on the 10 agencies that comprise the NIC and comes at a time of increasing complexity and uncertainty in Australia’s ...
Summary
The article emphasizes the urgent need for the Australian intelligence community to prioritize global catastrophic risks, particularly as technological advancements heighten these threats. It references a national intelligence community review that can serve as a pivotal moment to integrate catastrophic risk assessments into its framework. By highlighting threats such as nuclear war, extreme climate change, and engineered pandemics, the authors argue for legislative changes and strategic initiatives to fortify Australia against such existential risks. This focus is crucial not only for national security but also for coordinating with international allies like the Five Eyes, indicating significant implications for global AI safety and policy frameworks.
Body
The 2024 independent review of Australia’s national intelligence community has kicked off. It will focus on the 10 agencies that comprise the NIC and comes at a time of increasing complexity and uncertainty in Australia’s strategic environment.

Among the review’s terms of reference is a direction to consider the NIC’s ‘preparedness in the event of regional crisis and conflict’ and whether the NIC is positioned effectively to respond to the evolving security environment.

In our view, the review should demand special attention to one particularly complex problem: global catastrophic risk.

Global catastrophic risk, and its crueller cousin existential risk, are threats that could cause harm on a horrific scale. Nuclear winter, engineered pandemics, extreme climate change, space weather. Millions—even billions—could be killed. Recently, artificial intelligence experts have called out the extinction risk from AI.

In its 2020 Global Trends report, the US intelligence community warned the country’s new president, Joe Biden, and his administration of these concerns, made increasingly pressing by technological acceleration:

‘Technological advances may increase the number of existential threats; threats that could damage life on a global scale challenge our ability to imagine and comprehend their potential scope and scale, and they require the development of resilient strategies to survive. Technology plays a role in both generating these existential risks and in mitigating them. Anthropogenic risks include runaway AI, engineered pandemics, nanotechnology weapons, or nuclear war.’

It’s time for Australia’s intelligence community to be equally proactive.

The pathways to global catastrophe might seem unlikely. But that’s the role of intelligence: to identify, analyse and warn about threats to global and national security.
Should the NIC assess global catastrophic risk, it will see that the risk is uncomfortably high.

Take nuclear. A full-scale nuclear war between, say, the US and Russia could lead to the deaths of about five billion people within two years. An aggregate of expert estimates puts the annual probability of a nuclear war at around 1%. The same figure was produced by a recent quantitative risk assessment. That’s roughly a coin toss’s chance out to 2100.

Or take extreme climate change. Surprisingly little work has been done to assess catastrophic climate-change scenarios. The three most relevant studies put the chance of a catastrophic climate-change event at between 5% and 20%. The environmental, economic and security consequences in a world where warming is significantly higher than the 1.5°C target remain extremely uncertain.

Or take engineered pathogens. Advances in biological engineering could increasingly empower less skilled actors to synthesise novel diseases that are both highly lethal and highly infectious. In 2022, a world-leading bioengineering expert assessed that ‘within a decade, tens of thousands of skilled individuals will be able to access the information required for them to single-handedly cause new pandemics’. Biorisk experts forecast an 8% chance that a genetically engineered pathogen could kill more than a hundred million people by 2050. AI might accelerate those timelines.

This scale of risk is falling through the gaps. Australia, unlike many of its peers, doesn’t have a robust national risk assessment, and it’s unclear if any NIC agency is giving the risk the attention it deserves. The clock is ticking on all of these threats. In many cases, policy interventions or well-researched response plans could be relatively straightforward and highly effective.
The necessary first step is assessment. The intelligence review is the perfect opportunity to put global catastrophic risk on the radar and to correct course.

The 2018 legislation that governs the Office of National Intelligence is silent about whether these critical considerations are within its scope. One option is amending the act to explicitly include global catastrophic risk as a focus area. A simple addition to section 7—which sets out the duties of ONI—along the lines of ‘and global catastrophic risk that would threaten Australia’s survival, security and prosperity’ would ensure that the NIC doesn’t short-change these risks by focusing only on short-term requirements. Equally, a finding or recommendation that these risks are in scope for the NIC could catalyse action.

Going further, an extreme global threats mission within ONI could work across the NIC to collect data on, analyse and monitor these risks. A recurring ONI-led national assessment of global catastrophic risk would track the trends and strategic dynamics that could lead to global catastrophe. ONI could also lead a Five Eyes working group, including collaborating with the US Office of the Director of National Intelligence, which is beginning to prepare its next Global Trends report.

Regardless of the mechanism, NIC agencies must focus on pathways to, scenarios for and contributors to global catastrophic risk. Immediate priorities should be assessments of the impacts of AI on nuclear stability, AI-enabled cyber weapons, advanced autonomous weapons, geomagnetic storms on critical infrastructure, and engineered pathogens. The Australian Security Intelligence Organisation should track and assess the catastrophic risk emanating from domestic non-state actors and the potential for increasingly available AI and biotechnology tools to boost terrorist capability.

Intelligence is critical for helping policymakers navigate the future, which is at increasing risk of a global catastrophe.
The NIC must be Australia’s eyes and ears.