Soyweiser
‘i am a stochastic parrot and so are u’
reminds me of
“In his desperation to have produced reality through computation, he denigrates actual reality by equating it to computation”
(from this review/analysis of the Devs series). A pattern annoyingly common among the LLM AI fans.
E: Wow, I did not like the reactionary great man theory spin this article took there. I don’t think replacing the Altmans with Yarvins would solve much. (At least, that is how the NRx people would read this article). Quite a lot of the ‘we need more well-read renaissance men’ people turned into hardcore Trump supporters (and racists, and sexists, and…). (Note this edit is after I already got 45 upvotes).
BRING IT ON NITPICKY NUKE NERDS
Well ackchually we prefer to be called fission/fusion nerds
“Greenwashing is cheaper than action” indeed. (edit2) On that note, storytime about the clownshow that is Dutch politics.
So our radical right wing government is pro nuclear power, of course, and they want to build more power plants. So what are they planning on doing? They are going to start a study on which locations are best. Which is maddening, as these studies have already been done before (so it prob is just an attempt to have the study finish when they are no longer in power, so they are not at risk of starting a too-expensive megaproject). But it gets worse: the absolute clowns of our farmers’ party just went ‘fuck the studies’ and pointed at a province where there are a lot of farmers and went ‘we will put a power plant there’. And this is how they discovered nuclear power plants need running water, and they had picked one of the areas without a major river. (I’m ignoring the clownshow re ‘the immigration crisis’ (not a crisis) as this post is already too long, and there is a big risk of honk overdose if I go into that).
I have joked before about how people really into stoicism tend to be quite emotional and even a bit risky, as stoicism always seems to be aspirational and doesn’t describe the stoic fans’ actual behaviour (a good example is the YouTuber Sargon), but this might be a bit of an extreme example.
I’m afraid of that, as I fear you would see a lot of laughing people. The emperor knows he is naked, but he is still the emperor, and he sees the people pretend he has clothes. The people of the court also know the emperor is naked, but he is the emperor and they will go along with his wishes. What are they going to do? Quit?
brb, time to go shout ‘fucking nazi tenets’ at my local library.
Yes, we could just shoot the servers, but what if the AI develops an anti-bullet shield, and then we shoot it with anti-bullet-shield bullets, and then it creates an anti-bullet-shield-bullet bullet shield, and then… and then…
Anyway, those kinds of reality-free kids’ imagination games of move and counter-move were pretty cool when you were 8 years old.
Sorry, got distracted a bit and just wanted to share; not related to the topic at hand.
I know I’m a big nerd, but: “Musk also presented the Optimus robot, your plastic pal who’s fun to be with — ‘your own personal R2-D2, C-3PO!’”
Why would I need an astromech droid, Musk? At least a GNK droid provides power. Stupid sexy gonk droid.
Considering that the idea of the AGI singularity was the exponential function going straight up, I don’t think this person understands the problem. ‘Lol,’ foomed the scorpion, ‘lmao.’
(Also, that is some gross, weird eugenics shit).
E: also, isn’t IQ a number that gets renormed every now and then, with a common upper bound of 160? I know the whole post is more intended as vaguely aspirational eugenics, but still.
Anyway, time to start the lucrative field of HighIQHuman safety research. What do we do if the eugenics superhumans’ goals don’t align with humanity’s?
Wait, if AI learns the same way as humans, and humans are thus basically AI, does that mean I can just pirate everything? This will change… nothing, really.