
Center for AI Safety
1 open role
Description
The Center for AI Safety (CAIS) is a nonprofit working to reduce high-consequence risks from artificial intelligence through technical research and field-building. It conducts research on topics in AI safety and helps build the field of AI safety research by providing compute for researchers and by running fellowships and courses.
Open roles
Learn more
80,000 Hours links
External content
CAIS' work overview
More info on CAIS' field-building activities
Details about CAIS' 7-month philosophy research fellowship