Scott Alexander

Scott Alexander (1984–) is the pen name of LessWrong-rationalist blogger and psychiatrist Scott Alexander Siskind. After graduating magna cum laude with a bachelor's degree in Philosophy,[1] he earned an MD and then completed a residency as a psychiatrist-in-training.


He began writing on Less Wrong under the name Yvain, then branched out into his own blog, Slate Star Codex (a near-anagram of "Scott Alexander"). SSC became one of the top-tier blogs for LessWrong-style rationalists, with it and his related Tumblr serving as linchpins of the LessWrong Diaspora.[2] His current blog is Astral Codex Ten.

As is customary in the writing of psychiatrists and psychologists, he mashes up details of different patients when he writes about them, so as to fictionalize the accounts and avoid his patients being identified, as well as using a pseudonym himself.

He blogs on many subjects other than psychiatry. He is particularly fond of eugenics, expressly advocating a "positive" version of it.

SSC posts tend to range from long to extremely long. Alexander uses Twitter[3] and Tumblr[4] to post short/frivolous posts and puns.

Notable internet publications include his giant anti-neoreactionary FAQ,[5] his giant anti-libertarian FAQ,[6] his map of the rationalist blogosphere,[7] and a long collection of quotations from actual computer scientists on the subject of why we should take AI risk seriously.[8] Additionally, he posted a lengthy and famed criticism[9] of feminism, which had been spurred by a feminist backlash[10][11] against a blog comment[12] by MIT professor Scott Aaronson.

Alexander is a frequent visitor of local LessWrong-rationalist meetups[13] in the US, and organizes some of them himself.

The Slate Star Codex blog was taken down on 23 June 2020, on the apparent basis that a New York Times article on the Slate Star Codex subculture by Cade Metz was going to use Alexander's real name, and he feared for his safety as he had been harassed previously at work over his blogging. (This just happened to coincide with the reporter starting to get in touch with SSC critics, and not just cheerleaders.[14]) Alexander recommended readers move to the Reddit forums /r/slatestarcodex and /r/themotte.[15]

Political and social views

He does not always censor racist and sexist opinions in his comments section (except on open threads, where race and gender discussions are always banned), which some of his fellow LessWrong-style rationalists have a problem with.[16]

Iranian secularist Kaveh Mousavi, while agreeing with Alexander that the intellectually bankrupt sections of the social justice community deserve heavy critique, has nonetheless criticized Alexander himself for having an Americentric view of social issues, for creating a false equivalence between social justice advocates and social conservatives, and for downplaying discrimination against women and minorities in Western countries.[17] It is worth noting, though, that Alexander has been willing to defend the parts of social justice he views as worthwhile, such as uses for trigger warnings[18] and acknowledging that discrimination still exists and has massive economic costs.[19]

Neoreaction and racialism

See the main articles on this topic: Neoreactionary movement and Racialism

Alexander is critical of neoreactionaries, having written what is generally regarded as the definitive takedown of neoreaction,[5] though, per the header, he later walked back some of the points he made in it. Nevertheless, his blogroll is full of neoreactionaries and his comment section hosts a great deal of neoreactionary discussion. He knows a pile of them personally, keeps discussing their ideas on his blog, and, for example, treats Mencius Moldbug's Unqualified Reservations blog as an obvious go-to reference his readers will immediately understand when he's talking about gay relationship counselling.[20]

In 2014, Siskind sent an email describing RationalWiki as uninformative while extolling the benefits of reading neoreactionaries and racialist (HBD) proponents.[21]

Feminism

Alexander does not identify as a feminist or an anti-feminist,[22] but feels like he has been unfairly associated with both.

He talked of "the sane 30%-or-so of feminists"[23][note 1] and described some essays as “blurring the already thin line between feminism and literally Voldemort”.[25] He apparently regrets the popularity of this phrase, saying "NO NEED TO TAKE THIS ONE SENTENCE OUT OF CONTEXT AND TRY TO SPREAD IT ALL OVER THE INTERNET", though it really doesn't improve at all with context.[note 2]

In "SSC on Feminism", a post meant to clarify his position on feminism and feminist issues, he described his negative attitude towards parts of the movement:

I think there’s a whole corner of Internet feminism – the Jezebel, Gawker, and Modal Tumblr User faction – which is really scary. …

This strain is absolutely not the entirety of the movement – but it has become a big enough piece of the movement, and sufficiently dangerous to anybody who doesn’t share their views, that I think it really needs talking about and can’t be dismissed as “a few bad apples”. …

I will sometimes complain about “feminists” in a way that doesn’t necessarily mean the millions of feminists who follow good discussion norms and treat other people with respect. I’m trying to generalize less now and be much more precise about how I mean only a certain strain, but I have left the older posts untouched.

Alexander enthusiastically supported James Damore's Google's Ideological Echo Chamber: How Bias Clouds Our Thinking About Diversity and Inclusion, which argued that the gender imbalance in tech fields was at least partly due to a greater proportion of men than women having the kinds of interests, inclinations, and talents that draw people to tech jobs, and that Google might need to change its requirements for tech jobs to attract more women. This was after Alexander's fans on Reddit /r/slatestarcodex had spotted[26] that it was largely a restatement of Alexander's arguments in his post "Gender Imbalances Are Mostly Not Due To Offensive Attitudes".[27] Alexander wrote:[28]

And if you're reading this — sorry, huge respect for what you're trying, but it's pretty doomed. The best hope is a Fabian strategy of making sure there's enough of an underground of people who know what's up that they can quietly self-sort, form bubbles of liveability, and curb the worst excesses without forming a clear target for anybody. If you actually go riding in on a white horse waving a paper marked "ANTI-DIVERSITY MANIFESTO", you're just providing justification for the next round of purges.

Libertarianism

I feel pretty okay about both being sort of a libertarian and writing an essay arguing against libertarianism, because the world generally isn't libertarian enough but the sorts of people who read long online political essays generally are way more libertarian than can possibly be healthy.[29]

Communism

He is highly critical of communism, and has more generally been persistently critical of what he views as millenarian ideologies, i.e., a catastrophe will destroy the current system, handwave, a new Golden Age will arise from the ashes.[30] Much as with neoreaction, this hasn't stopped him from writing long book reports and getting very interested, for example, in the details of central planning in the USSR.[31]

Existential risks

Alexander believes that the risks of superintelligent AIs (e.g. the risk of them misconstruing our goals and turning us all into paperclips) have been repeatedly misrepresented and downplayed by the media, that while immediate disaster is unlikely, the threat is worth taking seriously, and now is a good time to research it.[32]

However, Alexander, who echoes the views of Machine Intelligence Research Institute (MIRI), Stephen Hawking, Elon Musk and Nick Bostrom on this, is not an AI researcher, nor a computer scientist (and the same goes for most of the "researchers" at MIRI, including Eliezer Yudkowsky). An actual AI researcher, Richard Loosemore, has criticized the assumptions behind many of the MIRI-style superintelligent AI doomsday scenarios, pointing out that an AI that thought it could correctly interpret the core goals of humanity but got them so hideously wrong would not in fact be worthy of the name "intelligent" at all, and that this is not merely a naming issue but a basic design issue for AIs.[33] Alexander has compiled a list of renowned and accomplished AI researchers expressing concerns about AI.[8]

Effective altruism

Alexander is a big supporter of charity on similar grounds and often gives talks on efficient charity,[34] and currently supports the Giving What We Can project, which attempts to separate effective charities from ineffective ones.[35]

Race and IQ

Alexander identifies with the 'hereditarian left',[36] and considers The Bell Curve co-author Charles Murray to be a close ideological ally.[37][38] He has also expressed support for Gregory Cochran and Henry Harpending's hypothesis that the frequency of congenital diseases among Ashkenazi Jews (of which Alexander is one) is caused by selection for intelligence,[39] as opposed to the multiple bottlenecks and founder effects for which there is actual evidence.[40] There is almost nothing he won't try to apply human biodiversity to, e.g. Harry Potter.[41]

The Slate Star Codex comments section and the /r/slatestarcodex subreddit are even more extreme on this issue, with Cochran, Steve Sailer and Emil O. W. Kirkegaard all having taken part in the discussion.

After the NYT article on Slate Star Codex was published,[42] Scott's fans were outraged that anyone would dare smear Scott by association with scientific racists. After about a week of that, Scott's ex-friend Topher Brennan (who used to blog as Christopher Hallquist, his name before marriage) posted the receipts: an email from 2014 in which Scott earnestly pleads with Topher to take up the banner of racialism, in its Human Biodiversity variant, explicitly recommending such luminaries as John G.R. Fuerst, Steve Sailer, Hbdchick, and various neoreactionaries: "I am monitoring Reactionaries to try to take advantage of their insight and learn from them."[43]

/r/slatestarcodex

As usual, you can make anything worse by adding Reddit. /r/slatestarcodex is an unofficial fan forum for the blog. Scott comments occasionally and is a moderator. The culture wars (a regular weekly thread, until it was recently branched off to the Scott-endorsed /r/themotte) and pseudoscientific racialism of "human biodiversity" are regular and upvoted topics (literally advocating the Fourteen Words will get you 40+ upvotes[44] and admiring replies). Of course, much more offensive than the racism is objecting to the racism, which gets you a day's ban.[45] According to one moderator, "A belief in HBD doesn’t automatically equate to racism", somehow.

The moderators have a partial registry of bans.[46]

After pressure from his friends, Alexander banned "culture war" discussions from /r/slatestarcodex and moved them to a new subreddit, /r/themotte — which is now a haven for race realism and for the sort of white nationalism that thinks it's erudite. Alexander simultaneously disclaimed /r/themotte and kept recommending it.

Dark Enlightenment philosopher Nick Land's 2014 psychological horror novella Phyl-Undhu includes a technological cult reminiscent of LessWrong, and a character called "Alex Scott" expressing some of Scott's ideas on the Doomsday Hypothesis, with an intelligence at the end of time you can communicate with, and a cultist pushed out of the cult who "wants to have not thought certain things."


Notes

  1. He specified in the comments: "The word 'sane' in that context should not be taken to mean 'stupid' or even 'holds stupid views', but rather 'willing to hold rational discussions about their views with someone they are tempted to consider an evil enemy, based on the Principle of Charity'"[24] The evil enemy in question being neoreactionaries.
  2. I dunno, you fuck one pony.

References

  1. Five Years And One Week Of Less Wrong by Scott Alexander (March 13, 2014) Slate Star Codex (archived from March 4, 2020).
  2. Rationalist movement LessWrong Wiki (archived from March 26, 2020).
  3. Scott Alexander Twitter (archived from April 18, 2020).
  4. Slate Star Scratchpad Tumblr (archived from June 23, 2020).
  5. The Anti-Reactionary FAQ by Scott Alexander (October 20, 2013) Slate Star Codex (archived from June 18, 2020).
  6. The Non-Libertarian FAQ by Scott Alexander (February 22, 2017) Slate Star Codex (archived from February 22, 2020).
  7. Mapmaker, Mapmaker, Make Me a Map by Scott Alexander (September 5, 2014) Slate Star Codex (archived from March 3, 2020).
  8. AI Researchers On AI Risk by Scott Alexander (May 22, 2015) Slate Star Codex (archived from April 26, 2020).
  9. Untitled by Scott Alexander (January 1, 2015) Slate Star Codex (archived from February 21, 2020).
  10. On Nerd Entitlement: White male nerds need to recognise that other people had traumatic upbringings, too - and that’s different from structural oppression. by Laurie Penny (29 December 2014) New Statesman.
  11. Amanda Marcotte (December 30, 2014). "MIT professor explains: The real oppression is having to learn to talk to women".
  12. Walter Lewin: Comment #171 by Scott Aaronson (December 14th, 2014 at 10:21 pm) Shtetl-Optimized: The Blog of Scott Aaronson.
  13. Less Wrong meetup groups LessWrong Wiki.
  14. I did recommend that the reporter talk to some people with expertise in neoreactionaries like @ElSandifer or in AI bias, it is interesting he'd delete his blog just as the reporter was going to start looking at more critical sources by Melissa McEwen (11:20 AM - 23 Jun 2020) Twitter (archived from June 23, 2020).
  15. NYT Is Threatening My Safety By Revealing My Real Name, So I Am Deleting The Blog by Scott Alexander (23 June 2020) Slate Star Codex (archived from 23 Jun 2020 08:43:57 UTC).
  16. How true is the statement 'the comment threads on Slate Star Codex are a nightmare to read through'? by Caio Camargo (December 25, 2014) Reddit.
  17. The Irregular Symmetry by Kaveh Mousavi (June 16, 2015) Patheos: On the Margins of Error.
  18. The Wonderful Thing about Triggers by Scott Alexander (May 30, 2014) Slate Star Codex (archived from February 14, 2020).
  19. Social Justice For The Highly-Demanding-Of-Rigor by Scott Alexander (April 20, 2013) Slate Star Codex (archived from May 2, 2020).
  20. Setting The Default by Scott Alexander (December 1, 2015) Slate Star Codex (archived from April 6, 2020).
  21. Backstabber Brennan knifes Scott Alexander with 2014 email by Topher Brennan (February 17, 2021) Emil O. W. Kirkegaard (archived from February 17, 2021).
  22. SSC On Feminism Slate Star Codex (archived from December 30, 2015).
  23. The Anti-Reactionary FAQ by Scott Alexander (October 20, 2013) Slate Star Codex (archived from May 20, 2020).
  24. The Anti-Reactionary FAQ by Scott Alexander (October 20, 2013) Slate Star Codex (archived from April 18, 2020).
  25. Scott Alexander, Radicalizing the Romanceless. Slate Star Codex, August 31, 2014
  26. Did the SSC post on Gender Imbalances and Offensive Attitudes inspire the Google SWE's Anti-Diversity manifesto? by church_on_a_hill (05 Aug 2017) Reddit (archived from August 13, 2017).
  27. Gender Imbalances Are Mostly Not Due To Offensive Attitudes by Scott Alexander (August 1, 2017) Slate Star Codex (archived from January 8, 2020).
  28. Did the SSC post on Gender Imbalances and Offensive Attitudes inspire the Google SWE's Anti-Diversity manifesto? by Scott Alexander (Feb 26, 2014) Reddit (archived from June 24, 2020).
  29. All Debates Are Bravery Debates by Scott Alexander (June 9, 2013) Slate Star Codex (archived from April 25, 2020).
  30. SSC Endorses Clinton, Johnson, Or Stein by Scott Alexander (September 28, 2016) Slate Star Codex (archived from June 4, 2020).
  31. Book Review: Red Plenty by Scott Alexander (September 24, 2014) Slate Star Codex (archived from February 5, 2020).
  32. No Time Like The Present For AI Safety Work by Scott Alexander (May 29, 2015) Slate Star Codex (archived from June 11, 2020).
  33. The Maverick Nanny with a Dopamine Drip: Debunking Fallacies in the Theory of AI Motivation by Richard Loosemore (Jul 24, 2014) Institute for Ethics and Emerging Technologies.
  34. Investment and Inefficient Charity by Scott Alexander (April 5, 2013) Slate Star Codex (archived from April 8, 2020).
  35. Giving What We Can
  36. Links 5/17: Rip Van Linkle by Scott Alexander (May 9, 2017) Slate Star Codex (archived from April 3, 2020).
  37. Three Great Articles On Poverty, And Why I Disagree With All Of Them by Scott Alexander (May 23, 2016) Slate Star Codex (archived from June 15, 2020).
  38. Clarification To "Sacred Principles As Exhaustible Resources" by Scott Alexander (April 12, 2017) Slate Star Codex (archived from January 11, 2020).
  39. The Atomic Bomb Considered As Hungarian High School Science Fair Project by Scott Alexander (May 26, 2017) Slate Star Codex (archived from February 17, 2020).
  40. A Population-Genetic Test of Founder Effects and Implications for Ashkenazi Jewish Diseases Montgomery Slatkin (2004) Am. J. Hum. Genet. 75(2): 282–293. doi:10.1086/423146.
  41. slatestarscratchpad by mugasofer (Jan 7th, 2016) Tumblr. This is despite apparently having never read them, but he can tell you all about Jensen.
  42. Silicon Valley's Safe Space by Cade Metz (February 13, 2021) The New York Times. https://www.nytimes.com/2021/02/13/technology/slate-star-codex-rationalists.html
  43. Twitter thread (archive)
  44. What are some true beliefs deep down you knew were true but didn't believe at one time because they were too uncomfortable to accept? by u/Danplanck (Feb 26, 2014) Reddit (archived from February 23, 2019).
  45. Slate Star Codex: In a Mad World, All Blogging is Psychiatry Blogging (Feb 26, 2014) Reddit (archived from September 15, 2017).
  46. Slate Star Codex: Registry of bans (Created Feb 26, 2014) Reddit.
This article is issued from Rationalwiki. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.