
UnseriousAcademic

UnseriousAcademic@awful.systems
Joined
10 posts • 42 comments

My hyperfixation for the last four years has been the band Lawrence: an eight-piece soul-funk group with a brass section and two lead vocalists.

The musicianship is incredible. Saw them live last month and you could tell there was no click track as the band members improvised off each other and the crowd. They were having a genuinely good time on stage messing around, and the energy was infectious. Easily the most fun I’ve had in years.

Also, co-vocalist Gracie’s voice! I’ve heard their albums so many times and there are still moments where I find myself muttering blasphemy as she fucking belts it out.

As I get older, my music tastes have definitely broadened from my relatively narrow range of Seattle grunge and metal. Still, with this band, my partner doesn’t quite know what’s happened to me.

Anyway, I recommend this live recording of Hip Replacement from last month.


My most charitable interpretation of this is that he, like a lot of people, doesn’t understand AI in the slightest. He treated it like Google: he asked for some of the most negative quotes from movie critics about past Coppola films, and the AI hallucinated some for him.

If true, it’s a great example of why AI is actually worse for information retrieval than a basic vector-based search engine.
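
To make the contrast concrete, here’s a minimal sketch (my own toy example, nothing from the article): a vector-based retriever can only ever hand back quotes that actually exist in its index, so its worst failure mode is an irrelevant result, whereas a generative model asked the same question can happily produce a quote nobody ever wrote.

```python
# Toy sketch of vector-based retrieval. The embedding is a stand-in
# (character-frequency vector); a real system would use a trained model.
# The point is only that retrieval scores *stored* documents rather than
# generating new text.
import numpy as np

def embed(text: str) -> np.ndarray:
    vec = np.zeros(128)
    for ch in text.lower():
        vec[ord(ch) % 128] += 1
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# Hypothetical stored review quotes
quotes = [
    "A bloated, self-indulgent epic.",
    "A masterpiece of mood over plot.",
]
index = np.stack([embed(q) for q in quotes])

def search(query: str) -> str:
    scores = index @ embed(query)          # cosine similarity via dot product
    return quotes[int(np.argmax(scores))]  # worst case: an irrelevant quote, never a fabricated one

print(search("most negative review of this film"))
```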


Who could have predicted that a first-principles, ground-up new Internet protocol based on monarchism would be a difficult sell.

*I mean, I think that’s what Urbit is. I’ve read multiple pieces describing it and I’m still not really clear.


Forgot to say: yes, AI-generated slop is one key example, but often I’m also thinking of other tasks that are presumed to be basic because humans can be trained to perform them with barely any conscious effort. Things like self-driving vehicles, production-line work, call center work, etc. Like the fact that full self-driving still requires supervision: often what happens with tech automation is that it creates things that de-skill the role, or perhaps speed it up, but still require humans in the middle to do the things that are simple for us but difficult to replicate computationally. Humans become the glue, slotted into all the points of friction and technical inadequacy, to keep the whole process running smoothly.

Unfortunately this usually leads to downward pressure on the wages of those humans, and to the expectation that they match the theoretical speed of the automation, rather than any recognition that the human is the actual pace-setter: without them the pace would be zero.


Based on my avid following of the Trashfuture podcast, I can authoritatively say that the “Hoon” programming language relies primarily on Australians doing sick burns and popping tyres in their Holden Commodores.


Funnily enough, that was the bit I wrote last, just before hitting post on Substack. A kind of “what am I actually trying to say here?” moment. Sometimes I have to switch off the academic bit of my brain and just let myself say what I think in order to get to clarity. Glad it hit home.

Thanks for the link. I’m going to read that piece and have a look through the ensuing discussion.


Oh god, it’s real? I saw pictures, and there were a lot of “it’s AI” claims which I kind of hoped were true.


There’s definitely something to this narrowing-of-opportunities idea. To put it in a really bare-bones way: these are people who frame the world in simplistic terms and then assume that their framing is the complete picture (because they’re super clever, of course). Then, when they try to address the problem with a “solution”, they only address their abstraction of it, and if they’re successful in the market they actually make that abstraction the dominant form of it. However, all the things they disregarded are either lost, or still there and undermining their solution.

It’s like taking a 3D problem while only seeing in 2D, implementing a 2D solution, and then being surprised that it doesn’t do what it should, or being confused by all these unexpected effects coming from the third dimension.

Your comment about giving more grace also reminds me of work from legal scholars who argue that algorithmically implemented law doesn’t work, because the law itself is designed to have a degree of interpretation and slack to it that rarely translates well to an “if x then y” model.
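
A toy illustration of that slack problem (my own hypothetical, not taken from the legal scholarship): take a classic “no vehicles in the park, except where reasonable” style rule and force it into “if x then y”. The code runs fine; the discretion just has nowhere to live.

```python
# Hypothetical sketch: a statute like "vehicles are prohibited in the park,
# except where reasonable" flattened into an "if x then y" rule.
def allowed_in_park(is_vehicle: bool) -> bool:
    # The boolean version has nowhere to put "reasonable": an ambulance,
    # a mobility scooter and a remote-control toy car all get the same answer.
    return not is_vehicle

print(allowed_in_park(is_vehicle=True))   # ambulance responding to an emergency -> False
print(allowed_in_park(is_vehicle=False))  # pedestrian -> True
```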


Oh no, the dangers of having people read your work!

It is coming, potentially in the next week. I was on leave for a couple of weeks, and since getting back I’ve been finishing up a paper with my colleague on Neoreaction and ideological alignment between disparate groups. We should be submitting to the journal very soon, so then I can get back to finishing off this series.


… Nope. In fact, one of my in-laws said they’d buy us an air fryer for Christmas once the sales came. Everyone forgot about it shortly after, and I don’t care one bit.
