
Innovation brings risk
Some of the greatest threats we face come from progress in biotechnology and advanced AI systems.

Global Catastrophic Risks Fund

Donate Now

Our objective

Stop the next global catastrophe in its tracks

We live in an era of new perils.

Humanity faces existential risks, including war between great powers, natural and engineered pandemics, thermonuclear war, threats from advanced artificial intelligence (AI), and frontier military technologies.

These global catastrophic risks have the potential to kill hundreds of millions, even billions, of people alive today.

We can come together – scientists, policymakers, engineers, military leaders, and motivated citizens – to mitigate these risks. It’s happened before. During the Cold War, political leaders negotiated to reduce stockpiles of weapons of mass destruction. At the turn of the millennium, scientists tracked large asteroids and comets in Earth’s vicinity. Today, countries are working together on global preparedness for the next pandemic disease.

The Global Catastrophic Risks Fund (GCRF) tackles far-future threats and takes action now to help protect every human being alive today. We aim to:

  • Reduce the probability of large-scale catastrophic events;
  • Mitigate the potential negative impacts of these events if they occur;
  • Improve the ability to anticipate new and emerging risks on the horizon.

Want to tackle climate change? We have an entire Fund dedicated to it.

The Global Catastrophic Risks Fund is a philanthropic co-funding vehicle that does not provide investment returns.

Photo by Cash Macanaya on Unsplash

Our strategy

We find opportunities to support highly impactful and neglected initiatives to reduce the probability of worldwide catastrophes and mitigate their consequences. This is a complex mission, with an ever-changing threat landscape. We give special consideration to threats that could curtail humanity’s future, leaning towards tractable solutions today. By seeking opportunities that are neglected by other grant-makers, we can ensure that the Fund is as high-leverage as possible.

Grant-making

Our decision-making is guided by three core values: impact, innovation, and flexibility.

  • To maximize impact, our grant portfolio includes direct interventions, such as funding the development of new crisis communications technology or personal protective equipment (PPE), as well as research and hits-based bets. Hits-based bets are initiatives whose success is less certain, but which have the potential to improve many more lives if they succeed.
  • We are committed to innovation, including developing new and better approaches to grant-making and providing seed funding for novel projects.
  • We maintain flexibility to respond rapidly to emerging crises and windows of opportunity. We work with networks of domain experts, trusted partners, and government decision-makers to identify new opportunities and deploy funds in the most effective ways.

When evaluating potential grants, we consider several factors:

  • Counterfactual impact.
  • Collaboration with trusted partners.
  • Avoidance of harm and information hazards.
  • Funding gaps we can fill.
  • Organizational strength.
  • Time-sensitive opportunities and policy windows.

We accept and review requests for funding. Learn more and apply here.

Photo by CDC on Unsplash

Direct and co-funded grants

  • October 2024 ($10,000): Supporting their work developing technologies and policies that protect the world against extreme biological risks.
  • October 2024 ($157,000): Incubating a policy research project on AI crisis preparedness, with the aim of improving how the U.S. government detects and responds to advanced AI threats.
  • August 2024 ($429,000): Establishing the Conclave on Great Powers and Extreme Risks, a twice-yearly coordinating forum on global catastrophic and existential risks for key policy stakeholders and organizers of track 1.5 and track 2 dialogues between great powers.

Advised grants

These grants have been identified, evaluated, and advised on by Fund Managers; resources were deployed by external philanthropists through their own giving infrastructure, separately from the Fund.

  • January 2024, Carnegie Endowment for International Peace ($2,504,000): To launch Project "Averting Armageddon".
  • October 2023 ($3,000,000): Seed funding to launch the organization.
  • September 2023, Pacific Forum ($200,000): For the U.S.-China Strategic Nuclear Dialogues.

Prevent the most severe global catastrophes

Donate Now

Founders Pledge members

Contribute easily from your Donor Advised Fund (DAF) through the member app. Don’t have a DAF, or want to discuss your options? Reach out to giving@founderspledge.com.

Donate on the member app

Not a member?

Contribute through Every.org or Giving What We Can. You can also contribute from any Donor Advised Fund; for details, reach out to giving@founderspledge.com.

Donate on Every.org
Donate on Giving What We Can
Donate on Charityvest

Our impact

Impact reports

2023 Impact Report

Meet the Fund Manager


Christian Ruhl

Christian Ruhl is a Senior Researcher at Founders Pledge. Christian’s work focuses on understanding, forecasting, and mitigating global catastrophic risks, including risks from great power conflict and weapons of mass destruction.

Previously, Christian managed the program on “The Future of the Global Order: Power, Technology, and Governance” at Perry World House, the University of Pennsylvania’s global affairs think tank. After receiving his BA from Williams College, he studied on a Dr. Herchel Smith Fellowship at the University of Cambridge for two master’s degrees, one in History and Philosophy of Science and one in International Relations and Politics, with dissertations on early modern state-sponsored science and Cold War nuclear strategy.

Christian was a member of the 2021 Project on Nuclear Issues (PONI) Nuclear Scholars Initiative, serves on the External Advisory Board of the Berkeley Risk and Security Lab (BRSL), and is a Mentor for summer fellows at the Cambridge Existential Risks Initiative (CERI). His writing has appeared in The Atlantic, the Bulletin of the Atomic Scientists, Foreign Policy, and more.

Learn More

  • Theories of Change for Track 2 Diplomacy
  • New research and recommendations on advanced AI
  • Great power competition and transformative technologies report
  • Global Catastrophic Biological Risks: A guide for philanthropists
  • Global Catastrophic Nuclear Risk: A guide for philanthropists
  • Great power conflict report
  • Call me, maybe? Hotlines and global catastrophic risk report