Eliezer Yudkowsky Quotes and Sayings
“Many have stood their ground and faced the darkness when it comes for them. Fewer come for the darkness and force it to face them.”
-- Eliezer Yudkowsky
“You are personally responsible for becoming more ethical than the society you grew up in.”
-- Eliezer Yudkowsky
“Trying and getting hurt can't possibly be worse for you than being... stuck.”
-- Eliezer Yudkowsky
“The police officer who puts their life on the line with no superpowers, no X-Ray vision, no super-strength, no ability to fly, and above all no invulnerability to bullets, reveals far greater virtue than Superman—who is only a mere superhero.”
-- Eliezer Yudkowsky
“Not every change is an improvement, but every improvement is a change; you can't do anything BETTER unless you can manage to do it DIFFERENTLY, you've got to let yourself do better than other people!”
-- Eliezer Yudkowsky
“If dragons were common, and you could look at one in the zoo - but zebras were a rare legendary creature that had finally been decided to be mythical - then there's a certain sort of person who would ignore dragons, who would never bother to look at dragons, and chase after rumors of zebras. The grass is always greener on the other side of reality. Which is rather setting ourselves up for eternal disappointment, eh? If we cannot take joy in the merely real, our lives shall be empty indeed.”
-- Eliezer Yudkowsky
“Through rationality we shall become awesome, and invent and test systematic methods for making people awesome, and plot to optimize everything in sight, and the more fun we have the more people will want to join us.”
-- Eliezer Yudkowsky
“Let the winds of evidence blow you about as though you are a leaf, with no direction of your own. Beware lest you fight a rearguard retreat against the evidence, grudgingly conceding each foot of ground only when forced, feeling cheated. Surrender to the truth as quickly as you can.”
-- Eliezer Yudkowsky
“If you want to maximize your expected utility, you try to save the world and the future of intergalactic civilization instead of donating your money to the society for curing rare diseases and cute puppies.”
-- Eliezer Yudkowsky
“A burning itch to know is higher than a solemn vow to pursue truth. To feel the burning itch of curiosity requires both that you be ignorant, and that you desire to relinquish your ignorance.”
-- Eliezer Yudkowsky
“It is triple ultra forbidden to respond to criticism with violence. There are a very few injunctions in the human art of rationality that have no ifs, ands, buts, or escape clauses. This is one of them. Bad argument gets counterargument. Does not get bullet. Never. Never ever never for ever.”
-- Eliezer Yudkowsky
“You cannot rationalize what is not rational to begin with - as if lying were called truthization. There is no way to obtain more truth for a proposition by bribery, flattery, or the most passionate argument - you can make more people believe the proposition, but you cannot make it more true.”
-- Eliezer Yudkowsky
“If people got hit on the head by a baseball bat every week, pretty soon they would invent reasons why getting hit on the head with a baseball bat was a good thing.”
-- Eliezer Yudkowsky
“Most Muggles lived in a world defined by the limits of what you could do with cars and telephones. Even though Muggle physics explicitly permitted possibilities like molecular nanotechnology or the Penrose process for extracting energy from black holes, most people filed that away in the same section of their brain that stored fairy tales and history books, well away from their personal realities: Long ago and far away, ever so long ago.”
-- Eliezer Yudkowsky
“Have I ever remarked on how completely ridiculous it is to ask high school students to decide what they want to do with the rest of their lives and give them nearly no support in doing so? Support like, say, spending a day apiece watching twenty different jobs and then another week at their top three choices, with salary charts and projections and probabilities of graduating that subject given their test scores? The more so considering this is a central allocation question for the entire economy?”
-- Eliezer Yudkowsky
“Moore's Law of Mad Science: Every eighteen months, the minimum IQ necessary to destroy the world drops by one point.”
-- Eliezer Yudkowsky
“I ask the fundamental question of rationality: Why do you believe what you believe? What do you think you know and how do you think you know it?”
-- Eliezer Yudkowsky
“Do not flinch from experiences that might destroy your beliefs. The thought you cannot think controls you more than thoughts you speak aloud. Submit yourself to ordeals and test yourself in fire. Relinquish the emotion which rests upon a mistaken belief, and seek to feel fully that emotion which fits the facts.”
-- Eliezer Yudkowsky
“If you've been cryocrastinating, putting off signing up for cryonics "until later", don't think that you've "gotten away with it so far". Many worlds, remember? There are branched versions of you that are dying of cancer, and not signed up for cryonics, and it's too late for them to get life insurance.”
-- Eliezer Yudkowsky
“Science has heroes, but no gods. The great Names are not our superiors, or even our rivals, they are passed milestones on our road; and the most important milestone is the hero yet to come.”
-- Eliezer Yudkowsky
“There are no surprising facts, only models that are surprised by facts; and if a model is surprised by the facts, it is no credit to that model.”
-- Eliezer Yudkowsky
“Crocker's Rules didn't give you the right to say anything offensive, but other people could say potentially offensive things to you, and it was your responsibility not to be offended. This was surprisingly hard to explain to people; many people would read the careful explanation and hear, "Crocker's Rules mean you can say offensive things to other people."”
-- Eliezer Yudkowsky
“Lonely dissent doesn't feel like going to school dressed in black. It feels like going to school wearing a clown suit.”
-- Eliezer Yudkowsky
“The human brain cannot release enough neurotransmitters to feel emotion a thousand times as strong as the grief of one funeral. A prospective risk going from 10,000,000 deaths to 100,000,000 deaths does not multiply by ten the strength of our determination to stop it. It adds one more zero on paper for our eyes to glaze over.”
-- Eliezer Yudkowsky
“Litmus test: If you can't describe Ricardo's Law of Comparative Advantage and explain why people find it counterintuitive, you don't know enough about economics to direct any criticism or praise at "capitalism" because you don't know what other people are referring to when they use that word.”
-- Eliezer Yudkowsky
“If you want to build a recursively self-improving AI, have it go through a billion sequential self-modifications, become vastly smarter than you, and not die, you've got to work to a pretty precise standard.”
-- Eliezer Yudkowsky
“The people I know who seem to make unusual efforts at rationality, are unusually honest, or, failing that, at least have unusually bad social skills.”
-- Eliezer Yudkowsky
“We underestimate the distance between ourselves and others. Not just inferential distance, but distances of temperament and ability, distances of situation and resource, distances of unspoken knowledge and unnoticed skills and luck, distances of interior landscape.”
-- Eliezer Yudkowsky
“If I'm teaching deep things, then I view it as important to make people feel like they're learning deep things, because otherwise, they will still have a hole in their mind for "deep truths" that needs filling, and they will go off and fill their heads with complete nonsense that has been written in a more satisfying style.”
-- Eliezer Yudkowsky
“By and large, the answer to the question "How do large institutions survive?" is "They don't!" The vast majority of large modern-day institutions, some of them extremely vital to the functioning of our complex civilization, simply fail to exist in the first place.”
-- Eliezer Yudkowsky