First class is on 1/29
This is an introductory course, so prior familiarity with ideas and material in the field of AI Governance is not required.
In this course, we examine risks posed by increasingly advanced AI systems, ideas for addressing these risks through standards and regulation, and foreign policy approaches to promoting AI safety internationally. There is much uncertainty and debate about these issues, but the following resources offer a useful entry point for understanding how to reduce extreme risks in the current AI governance landscape.
First, we'll examine the risks posed by advances in AI. Machine learning has advanced rapidly in recent years, and further progress could pose catastrophic risks, including powerful AI systems acting against human interests, AI-assisted bioterrorism, and AI-exacerbated conflict.
Standards and regulations, such as model evaluations and security requirements, could help address extreme risks from frontier AI. One approach to reducing risks from states that do not regulate AI is for a cautious coalition of countries to establish a lead in AI (e.g., through hardware export controls, information security, and immigration policy) and use that lead to reduce risks. Another approach is expanding the cautious coalition, perhaps through treaties with privacy-preserving verification mechanisms. We'll provide resources to help you examine both approaches.
Other prominent (and sometimes complementary) AI governance ideas include lab governance, mitigating AGI misuse, slowing AI development now, and a "CERN for AI." We'll briefly survey these approaches so that you know what others are discussing.
The final week discusses ways you can contribute through policy work, strategy research, "technical governance" work, and other paths.
There will be one in-person meeting each week, where students review and discuss that week's content in small groups. Each week's material includes mandatory readings (60-90 minutes per week). Most weeks, short reflections on the assigned readings are due before the meeting. In total, we expect each week's materials to take about 2 hours to engage with. There are also exercises to help you think through the topics yourself and make progress in your understanding of AI governance. We organize content into weeks to help you structure and pace your engagement with the curriculum.
Students will receive credit for attending weekly meetings and completing core readings and reflections. Students are allowed at most 2 unexcused absences, and staff will be accommodating about reasons for absences. Up to two reflections may be missed without a grade penalty.
Requirements to Pass
To pass, students must:
Have at most 2 unexcused absences from weekly meetings.
Complete all reflection posts (up to two may be missed without a grade penalty).
Complete the final essay with a mark of at least 70%.