Take Action

There are a few hundred researchers in the world focused on preventing extinction from AI. You can make a real difference! Raise awareness, reach out to your government, get involved, learn more, and see how you can help with your specific skillset.

Raise awareness

  • Share on social media. You can share or retweet Stop AGI content, media articles, open letters like the Center for AI Safety’s statement, etc. 
  • Give a presentation where you work. It’s a helpful way to develop your understanding, find sympathetic people to collaborate with, or find skeptics who can sharpen your views.    
  • Organize discussions. Invite speakers, gather a group to read and discuss posts together, and participate in online discussions. 
  • Talk to people. In particular, if someone in your network has influence in tech, academia, or government, making them aware of AI risk could make a big difference.  
  • Write posts, make videos, or even create memes about AI risk ideas. If we like your content, we might share it on Stop AGI!  

Reach out to your government

  • Write to your representatives in Congress/Parliament. We recommend using this template.
  • Respond to government requests for comment and proposals. Government agencies often put out calls for input, and bringing extinction risk to their attention can be really helpful! An example is the US Office of Science and Technology Policy request for information.
  • Create a petition. In many places, petitions with enough signatures can directly lead to propositions being voted on or heavily influence regulation. 

Get involved 

  • Join the Stop AGI projects Discord. We’re helping coordinate people to create and translate content, make memes, organize campaigns, reach out to policymakers, and figure stuff out. 
  • Join a community discussing AI safety, or start one! Here’s a list of communities; many of them can advise on starting more. 
  • Get support from groups like AI Safety Support (provides resources, career advising, feedback on events, and even a health coach) or 80,000 Hours (does research, provides career advising, and has a podcast). 
  • Get oriented with this helpful map of the major researchers, bloggers, funders, etc. associated with mitigating AI extinction risk. 

Learn more

  • Check out AISafety.Info. It’s an interactive site with answers to tons of AI safety questions, like “Why can’t we just do X?” and “How do I work on AI policy?” 
  • Read our recommended resources. There are tons of other helpful posts, podcasts, videos, articles, research papers, etc. 
  • Join an AI safety reading group. We recommend AGI Safety Fundamentals and AISafety.com. There are also local groups in many big cities and top universities. 

If you are...

Technical

  • Reach out to us! We may be able to connect you with upskilling/research programs and organizations hiring AI alignment researchers. 
  • Apply to internships, research programs, or jobs focused on mitigating AI extinction risk. We recommend checking out the research program SERI MATS and this job board.  
  • If you already work at a major AI company and your role is not focused on AI safety: (1) talk to your colleagues about AI risk, and (2) quit and tell people why. Geoffrey Hinton had a huge impact by leaving Google to speak openly about AI risk; you can do the same. 

A journalist or content creator

An executive or government official 

What not to do

Technical

  • Don’t join AI companies to work on developing AI. There’s a history of people saying they’re working on AI to make the world better, only to make it worse. Don’t accelerate the problem you’re trying to solve. If you’re technically talented, there are tons of AI alignment research directions you can pursue that don’t accelerate AI capabilities. 
  • Don’t participate in destructive protests or take destructive actions. Again, we don’t want to cause more harm than good, and it’s surprisingly easy to do so by accident. In general, get feedback before taking on significant projects to mitigate this risk.