Eliezer Yudkowsky
Eliezer Shlomo Yudkowsky is an American artificial intelligence researcher known for popularizing the idea of friendly artificial intelligence. He is a co-founder and research fellow at the Machine Intelligence Research Institute, a private research nonprofit based in Berkeley, California...
Nationality: American
Profession: Writer
Date of Birth: 11 September 1979
Country: United States of America
What people really believe doesn't feel like a BELIEF, it feels like the way the world IS.
Since the rise of Homo sapiens, human beings have been the smartest minds around. But very shortly - on a historical scale, that is - we can expect technology to break the upper bound on intelligence that has held for the last few tens of thousands of years.
To confess your fallibility and then do nothing about it is not humble; it is boasting of your modesty.
If you've been cryocrastinating, putting off signing up for cryonics "until later", don't think that you've "gotten away with it so far". Many worlds, remember? There are branched versions of you that are dying of cancer, and not signed up for cryonics, and it's too late for them to get life insurance.
There's a popular concept of 'intelligence' as book smarts, like calculus or chess, as opposed to, say, social skills. So people say that 'it takes more than intelligence to succeed in human society.' But social skills reside in the brain, not the kidneys.
We tend to see individual differences instead of human universals. Thus, when someone says the word 'intelligence,' we think of Einstein instead of humans.
I am a full-time Research Fellow at the Machine Intelligence Research Institute, a small 501(c)(3) public charity supported primarily by individual donations.
Existential depression has always annoyed me; it is one of the world's most pointless forms of suffering.
After all, if you had the complete decision process, you could run it as an AI, and I'd be coding it up right now.
You couldn't change history. But you could get it right to start with. Do something differently the FIRST time around. This whole business with seeking Slytherin's secrets... seemed an awful lot like the sort of thing where, years later, you would look back and say, 'And THAT was where it all started to go wrong.' And he would wish desperately for the ability to fall back through time and make a different choice. Wish granted. Now what?
If you want to maximize your expected utility, you try to save the world and the future of intergalactic civilization instead of donating your money to the society for curing rare diseases and cute puppies.
I don't want to rule the universe. I just think it could be more sensibly organised.
I only want power so I can get books.