https://nonesense.substack.com/p/lesswrong-house-style

Given that they are imbeciles given, occasionally, to dangerous ideas, I think it’s worth taking a moment now and then to beat them up. This is another such moment.

19 points

This is obviously insane; the correct conclusion is that learning models cannot in fact be trained so hard that they will always get the next token correct. This is provable, and it’s not even hard to prove. It’s intuitively obvious, and a burly argument that backs the intuition is easy to build.

You do, however, have to approach it through analogies, through toy models. When you insist on thinking about the whole thing at once, you wind up essentially just saying things that feel right, things that are appealing. You can’t actually reason about the damned thing at all.
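One toy version of that argument might go like this (my sketch, not taken from the linked post; the predictor f, the vocabulary V, and the sequence a are notation I’m introducing purely for illustration). Take any deterministic next-token predictor f over a vocabulary with at least two tokens, and build the sequence that at every step picks a token other than the one f predicts:

\[
\hat{x}_{t+1} = f(x_{1:t}), \qquad a_{t+1} \in V \setminus \{ f(a_{1:t}) \} \neq \varnothing \quad \text{whenever } |V| \ge 2.
\]

On the sequence a, f is wrong at every single position, no matter how hard it was trained. The stochastic version is the same in spirit: if the source has nonzero entropy at some step, the most likely next token has probability below 1, so even the best possible predictor has a nonzero chance of getting that token wrong.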

this goes a long way towards explaining why computer pseudoscience — like a fundamental ignorance of algorithmic efficiency and the implications of the halting problem — is so common and even celebrated among lesswrongers and other TESCREALs who should theoretically know better

9 points

I am reminded of this excellent essay that I saved a while back: “your brain does not process information and it is not a computer”

4 points

I’m out of the loop: what is lesswrong and why is it cringe?

16 points

It’s complicated.

It’s basically a forum created to venerate the works and ideas of the guy who, in the first wave of LLM hype, had an editorial published in TIME calling for a worldwide moratorium on AI research and GPU sales, to be enforced with unilateral airstrikes, and whose core audience got there by being groomed by one of the most obnoxious Harry Potter fanfictions ever written, by said guy.

Their function these days tends to be to provide an ideological backbone of bad scifi justifications for deregulation and the billionaire takeover of the state, which among other things has made them hugely influential in the AI space.

They are also communicating vessels with Effective Altruism.

If this piques your interest, check the links in the sidebar.

4 points

They are also communicating vessels with Effective Altruism.

I have a basic understanding of what EA is but what do you mean by communicating vessels?

13 points

RationalWiki (not affiliated with LW Rationalists; quite the opposite, actually, the OP is a mod there) has a page on it: https://rationalwiki.org/wiki/Less_wrong

7 points

That sounds like a religion insisting it isn’t one

5 points

Ok rationalwiki actually seems like a really useful resource for reading up on which sexy new movements are bullshit and which aren’t

4 points

Ah thanks!

10 points

They’re basically fanboys of whatever cult has most recently come out of Silicon Valley.

15 points

This is an interesting companion to that other essay castigating Rationalist prose, Elizabeth Sandifer’s The Beigeness. The current LW style indulges in straight-up obscurantism and technobabble, which is probably better at hiding how dumb the underlying argument is and cloaking unsupported assertions as meaningful arguments. It also doesn’t require you to be as widely-read as our favorite philosophy major turned psychiatrist turned cryptoreactionary, since you’re not switching contexts every time it starts becoming apparent that you’re arguing for something dumb and/or racist.

9 points

This has always been the case. I think I first stumbled across less wrong in the early two thousands when I was a maths undergrad.

At that point it was mostly Eliezer writing extremely long blog posts about Bayesian thinking, and my take-home was just: wow, these guys are really bad at maths.

A good mathematician will carefully select the right level of abstraction to make what they’re saying as simple as possible. LessWrong has always done the complete opposite: everything is full of junk details and needless complexity, in order to make it feel harder than it really is.

Basically, Eliezer needs an editor, and everyone who copies his style needs one too.

6 points

Oh, nice! I stumbled across this essay ages ago and then lost track of it because I forgot to bookmark it. Thanks for bringing it back to my attention.

It is quite a beautiful thing to see Scott Alexander’s beige technobabble eviscerated by such vibrant and incisive prose.

11 points

Such a good post. LWers are either incapable of critical thought or self-scrutiny, or they’re unwilling and think verbal diarrhea is a better choice.

11 points

It’s an ironic tragedy that the average LWer claims to value critical thought far more than most people do, and this causes them to do themselves a disservice by sheltering in an echo chamber. Thinking of themselves as both smart and special helps them to make sense of the world and their relative powerlessness as an individual (“no, it’s the children who are wrong” meme.jpeg). Their bloviating is how they maintain the illusion.

I feel comfortable speculating because in another world, I’d be one of them. I was a smart kid, and building my entire identity around that meant I grew into a cripplingly insecure adult. When I wrote, I would meander and over-hedge my position because I didn’t feel confident in what I had to say; post-graduate study was especially hard for me because it required finding what I had to say on a matter and backing myself on it. I’m still prone to waffling, but I’m working on it.

The LW excerpts that are critiqued in the OP are so sad to me because I can feel the potential of some interesting ideas beneath all the unnecessary technobabble. Unfortunately, we don’t get to see that potential, because dressing up crude ideas for a performance isn’t conducive to the kinds of discussions that help ideas grow.

6 points

In the Going Clear documentary an author says that because Scientology was built by and for L. Ron Hubbard, people who follow Scientology are gradually moulded in his image and pick up his worst traits and neuroses. LessWrong was founded by a former child prodigy…

6 points

…with a huge chip on his shoulder about how the system caters primarily to normies instead of specifically to him, thinks he has fat-no-matter-what genes and is really into rape play.


SneerClub

!sneerclub@awful.systems

Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it’s amusing debate.

[Especially don’t debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]
