Eliezer Yudkowsky
![Eliezer Yudkowsky](/assets/img/authors/eliezer-yudkowsky.jpg)
Eliezer Shlomo Yudkowsky is an American artificial intelligence researcher known for popularizing the idea of friendly artificial intelligence. He is a co-founder and research fellow at the Machine Intelligence Research Institute, a private research nonprofit based in Berkeley, California...
Nationality: American
Profession: Writer
Date of Birth: 11 September 1979
Country: United States of America
Tags: machines, research, charity
I am a full-time Research Fellow at the Machine Intelligence Research Institute, a small 501(c)(3) public charity supported primarily by individual donations.
Tags: cancer, thinking, dying
If you've been cryocrastinating, putting off signing up for cryonics "until later", don't think that you've "gotten away with it so far". Many worlds, remember? There are branched versions of you that are dying of cancer, and not signed up for cryonics, and it's too late for them to get life insurance.
Tags: smart, book, skills
There's a popular concept of 'intelligence' as book smarts, like calculus or chess, as opposed to, say, social skills. So people say that 'it takes more than intelligence to succeed in human society.' But social skills reside in the brain, not the kidneys.
Tags: thinking, differences, individual
We tend to see individual differences instead of human universals. Thus, when someone says the word 'intelligence,' we think of Einstein instead of humans.
Tags: annoyed, suffering, world
Existential depression has always annoyed me; it is one of the world's most pointless forms of suffering.
Tags: fall, years, choices
You couldn't change history. But you could get it right to start with. Do something differently the FIRST time around. This whole business with seeking Slytherin's secrets... seemed an awful lot like the sort of thing where, years later, you would look back and say, 'And THAT was where it all started to go wrong.' And he would wish desperately for the ability to fall back through time and make a different choice. Wish granted. Now what?
Tags: cute, civilization, trying
If you want to maximize your expected utility, you try to save the world and the future of intergalactic civilization instead of donating your money to the society for curing rare diseases and cute puppies.
Tags: thinking, want, organised
I don't want to rule the universe. I just think it could be more sensibly organised.
Tags: book, want, I can
I only want power so I can get books.
Tags: dark, shining, fool
This is one of the primary mechanisms whereby, if a fool says the sun is shining, we do not correctly discard this as irrelevant nonevidence, but rather find ourselves impelled to say that it must be dark outside.
Tags: book, math, thinking
He'd met other prodigies in mathematical competitions. In fact he'd been thoroughly trounced by competitors who probably spent literally all day practising maths problems and who'd never read a science-fiction book and who would burn out completely before puberty and never amount to anything in their future lives because they'd just practised known techniques instead of learning to think creatively.
Tags: children, crazy, despair
"Like that's the only reason anyone would ever buy a first-aid kit? Don't take this the wrong way, Professor McGonagall, but what sort of crazy children are you used to dealing with?" "Gryffindors," spat Professor McGonagall, the word carrying a freight of bitterness and despair that fell like an eternal curse on all youthful heroism and high spirits.
Tags: space, somewhere else, okay
Okay, so either (a) I just teleported somewhere else entirely (b) they can fold space like no one's business or (c) they are simply ignoring all the rules.
Tags: fall, ideas, keys
The purest case of an intelligence explosion would be an Artificial Intelligence rewriting its own source code. The key idea is that if you can improve intelligence even a little, the process accelerates. It's a tipping point. Like trying to balance a pen on one end - as soon as it tilts even a little, it quickly falls the rest of the way.