Screencap: "1. We want AGI to empower humanity to maximally flourish in the universe. We don’t expect the future to be an unqualified utopia, but we want to maximize the good and minimize the bad, and for AGI to be an amplifier of humanity.
2. We want the benefits of, access to, and governance of AGI to be widely and fairly shared.
3. We want to successfully navigate massive risks. In confronting these risks, we acknowledge that what seems right in theory often plays out more strangely than expected in practice. We believe we have to continuously learn and adapt by deploying less powerful versions of the technology in order to minimize “one shot to get it right” scenarios."
https://cdn.masto.host/daircommunitysocial/media_attachments/files/109/938/097/016/396/063/original/9c09f18edf235cda.png