Less Wrong
"Uncertainty exists in the map, not in the territory."
Less Wrong is a community blog devoted to rationality. Contributors draw on many scientific disciplines, from quantum physics and Bayesian probability to psychology and sociology. The blog focuses on the human flaws that lead to misconceptions about the sciences, making it a gold mine of interesting ideas and unusual perspectives on nearly any subject, and its clear writing style makes complex ideas easy to understand.
The mainstream community on Less Wrong is firmly atheistic, and a good number of contributors are computer professionals. Some, like founder Eliezer Yudkowsky, work in the field of Artificial Intelligence; in particular, Less Wrong has roots in Yudkowsky's effort to design "Friendly AI"[1], and as a result its examples often feature AI or transhumanist elements (though this also serves to discuss minds-in-general, as contrasted with our particular human minds).
- Three Worlds Collide is hosted here.
- Harry Potter and the Methods of Rationality is occasionally discussed here.
- Back from the Dead: Some in the Less Wrong community hope to achieve this through cryonics.
- Blue and Orange Morality: One of the core concepts of Friendly AI is that it's entirely possible to make something as capable as a human being that has completely alien goals. Luckily, there's already an example of an 'optimization process' completely unlike a human mind right here on Earth that we can use to see how good we are at truly understanding the concept.
"The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else."
- Concepts Are Cheap: Applause Lights.
- Deus Est Machina: Yudkowsky and some other Less Wrong members from the Singularity Institute for Artificial Intelligence are working on making one. The Singularity is eagerly awaited.
- Hollywood Atheist: Most often averted, but there may be some who act like the Jerkass variety. Religion is rarely a topic of discussion, as the social baseline is strongly atheist (though a spirited early-2011 thread briefly brought it up).
- Humans Are Flawed: As a result of having been 'designed' slowly and very much imperfectly by the 'idiot god' that is evolution.
- Living Forever Is Awesome: The prevailing view of almost everyone on Less Wrong; hence the strong Transhumanist bent.
- Logic Failure: Revealed to be shockingly common in normal human minds, and something rationalists train themselves to avoid.
- Phrase Catcher: The Flame Bait topic of politics is met with "politics is the mind-killer".
- Talking Your Way Out: The AI-Box Experiment.
- Transhumanism: Their philosophy and goal.
- Straw Vulcan: Averted. Less Wrong community members do not consider rationality to *necessarily* be at odds with emotion. Also, Spock is a terrible rationalist.
- Wiki Walk: Fairly easy to go on one, thanks to the links from each article to other articles. Certain lines of thought on related issues are also organized into 'sequences', which makes them more conveniently accessible.