The Alignment Fellowship
How do we navigate the next decade safely?
Source: Mollick
Advanced AI systems are improving rapidly. Within years, we may create systems more capable than humans across most domains. If that happens, the consequences will be profound. In the best case, we solve problems we've struggled with for centuries. In the worst case, we lose control entirely.
The Alignment Fellowship is a condensed five-week course that introduces curious and highly motivated students and working professionals to the fundamental concepts in AI safety. Each week, you'll complete assigned readings and meet to discuss them with a small cohort. Depending on your background and interests, we offer two tracks: a technical track and a governance track. Both tracks share foundational content: understanding how AI systems might become misaligned with human goals, how they might acquire power, and what's at stake if they do. From there, each track pursues its own focus:
Best suited for those with backgrounds in computer science, mathematics, physics, or other quantitative fields.
What you'll cover:
Best suited for those interested in policy, international relations, law, or the political economy of technology.
What you'll cover:
Alongside that:
From researchers at Redwood Research, UK AI Security Institute, Apollo Research and more.
One-on-one sessions to help you understand your next steps towards an AI safety career.
You will be embedded in the Meridian ecosystem, with opportunities to meet participants from ERA:AI, ERA:AIxBiosecurity, the Visiting Researchers Programme, and more.
The programme runs in-person in Cambridge, UK from 2 February to 6 March 2025. Meals will be provided for each session.
We invest significant time and resources into each cohort: talks from leading researchers, one-on-one career support, dinners, and access to the Cambridge AI safety community. In return, we ask that you invest too.
Each week has core readings and optional deeper dives. We expect you to complete 1–1.5 hours of core readings before each session and come ready to discuss them.
Attend the sessions, talks, and workshops we organise.
This is your chance to figure out whether AI safety matters to you and what role you might play in mitigating the risks.
If you're unable to make the commitment, we'd still love to have you engage with us. You can join our mailing list to hear about our weekly events, public talks, and other activities throughout the year.
15-minute application. Selected candidates will be invited to an asynchronous interview.
Rolling admissions. We'll close the form once spots fill up, so apply sooner rather than later.
Final deadline: Sunday, 25 January (end of day GMT)
No, there are no fees associated with the Alignment Fellowship.
Yes. The Alignment Fellowship is hosted in-person in Cambridge, UK. We expect almost all applicants to already be based in either Cambridge or London.
For London-based participants, we may be able to cover travel expenses from London to Cambridge if it would otherwise prohibit you from joining the programme.
We expect participants to commit roughly 3–5 hours per week across workshops, pre-readings & assignments, and social events.
When you apply, we will ask for your availability. Weekly meeting times will be finalised in the weeks leading up to the first session; we will choose times that allow the greatest number of participants to attend.