
Sailor Sega Saturn

sailor_sega_saturn@awful.systems
2 posts • 292 comments

I am the journeyer from the valley of the dead Sega consoles. With the blessings of Sega Saturn, the gaming system of destruction, I am the Scout of Silence… Sailor Saturn.


Within 5 minutes of my first hike, the trees smiled at me and whispered their simple wisdom.

This probably only sounds profound to people who haven’t been outside in 7 years. Don’t get me wrong, hiking is good for the soul. But if it hits you that hard after five minutes, you’re probably terminally online.

Also why can’t trees have complex wisdom gosh darn it?


Great, no problem, I’ll just read through the Sequences and be all caught up!


Hopefully 2025 will be a nice normal year–

Cybertruck outside of Trump hotel explodes violently and no one can figure out if it was a bomb or just Cybertruck engineering

Huh. I guess it’ll be another weird one.

(I know I know, low effort post, I’m sick in bed and bored)


Once a month or so, Awful Systems casually mentions a racist in some sub-sub-culture whom I had never heard of before, and then I get to spend an hour doing background research on obscure net drama from 2013 or whatever.


Open Phil generally seems to be avoiding funding anything that might have unacceptable reputational costs for Dustin Moskovitz

“reputational cost” eh? Let’s see Mr. Moskovitz’s reasoning in his own words:

Spoiler - It's not just about PR risk

But I do want agency over our grants. As much as the whole debate has been framed (by everyone else) as reputation risk, I care about where I believe my responsibility lies, and where the money comes from has mattered. I don’t want to wake up anymore to somebody I personally loathe getting platformed only to discover I paid for the platform. That fact matters to me.

I cannot control what the EA community chooses for itself norm-wise, but I can control whether I fuel it.

I’ve long taken for granted that I am not going to live in integrity with your values and the actions you think are best for the world. I’m only trying to get back into integrity with my own.

If you look at my comments here and in my post, I’ve elaborated on other issues quite a few times and people keep ignoring those comments and projecting “PR risk” on to everything. I feel incapable of being heard correctly at this point, so I guess it was a mistake to speak up at all and I’m going to stop now. [Sorry I got frustrated; everyone is trying their best to do the most good here] I would appreciate if people did not paraphrase me from these comments and instead used actual quotes.

again, beyond “reputational risks”, which narrows the mind too much on what is going on here

“PR risk” is an unnecessarily narrow mental frame for why we’re focusing.

I guess “we’re too racist and weird for even a Facebook exec” doesn’t have quite the same ring to it though.


Yes, but if I donate to Lightcone I can get a T-shirt for $1000! A special edition T-shirt! Whereas if I donated $1000 to Archive Of Our Own, all I’d get is… a full-sized cotton blanket, a mug, a tote bag, and a mystery gift.


Holy smokes, that’s a lot of words. From their own post it sounds like they massively over-leveraged and have no more sugar daddies, so now their convention center is doomed ($1 million in interest payments every year!), but they can’t admit that, so they’re desperately trying to delay the inevitable.

Also don’t miss this promise from the middle:

Concretely, one of the top projects I want to work on is building AI-driven tools for research and reasoning and communication, integrated into LessWrong and the AI Alignment Forum. […] Building an LLM-based editor. […] AI prompts and tutors as a content type on LW

It’s like an anti-donation message. “Hey if you donate to me I’ll fill your forum with digital noise!”


Days since last comparison of ChatGPT to a shitty university student: zero

More broadly I think it makes more sense to view LLMs as an advanced rubber ducking tool - like a broadly knowledgeable undergrad you can bounce ideas off to help refine your thinking, but whom you should always fact check because they can often be confidently wrong.

Seriously why does everyone like this analogy?


Debating post-truth weirdos for large sums of money may seem like a good business idea at first, until you realize how insufferable the debate format is (and that no one normal would agree to judge such a thing).
