Eliezer Yudkowsky


Eliezer Shlomo Yudkowsky (born September 11, 1979) is an American blogger, writer, and advocate for friendly artificial intelligence, with an interest in decision theory.


Yudkowsky, a resident of Berkeley, California, has no formal education in computer science or artificial intelligence (Singularity Rising, by James Miller, page 35). He co-founded the nonprofit Singularity Institute for Artificial Intelligence (now the Machine Intelligence Research Institute) in 2000 and continues to be employed there as a full-time Research Fellow.


Yudkowsky’s interests focus on artificial intelligence theory for self-understanding, self-modification, and recursive self-improvement (seed AI), and on artificial-intelligence architectures and decision theories for stable motivational structures (Friendly AI and Coherent Extrapolated Volition in particular). Apart from his research work, Yudkowsky has written explanations of various philosophical topics in non-academic language, particularly on rationality, such as "An Intuitive Explanation of Bayes’ Theorem". He is also known for his AI Box experiment.


Yudkowsky was, along with Robin Hanson, one of the principal contributors to the blog Overcoming Bias, sponsored by the Future of Humanity Institute of Oxford University. In early 2009, he helped to found Less Wrong, a "community blog devoted to refining the art of human rationality". The Sequences on Less Wrong, comprising over two years of blog posts on epistemology, artificial intelligence, and metaethics, form the bulk of Yudkowsky’s writing.

He contributed two chapters to Global Catastrophic Risks, an edited volume by Oxford philosopher Nick Bostrom and Milan Ćirković, and the paper "Complex Value Systems are Required to Realize Valuable Futures" to the AGI-11 conference.

Yudkowsky is the author of the Singularity Institute publications "Creating Friendly AI" (2001), "Levels of Organization in General Intelligence" (2002), "Coherent Extrapolated Volition" (2004), and "Timeless Decision Theory" (2010).

Yudkowsky has also written several works of science fiction and other fiction. His Harry Potter fan fiction story Harry Potter and the Methods of Rationality illustrates topics in cognitive science and rationality (The New Yorker described it as "a thousand-page online ‘fanfic’ text called ‘Harry Potter and the Methods of Rationality’, which recasts the original story in an attempt to explain Harry’s wizardry through the scientific method", pg. 54). It has been reviewed by authors David Brin (http://davidbrin.blogspot.com/2013/02/science-fiction-and-our-duty-to-past.html) and Rachel Aaron, by Daniel Snyder of The Atlantic, by Robin Hanson and Aaron Swartz, and by programmer Eric S. Raymond.