think I forgot this one
I feel like generative AI is an indicator of a broader pattern of stagnation in innovation (shower thoughts here, I'm not bringing sources to this game).
A little while ago I was wondering whether there's an argument to be made that the innovations of the post-war period were far more radically and beneficially transformative for most people. Stuff like accessible dishwashers, home tools, better home refrigeration, etc. I feel like now tech is just here to make things worse. I can't think of any upcoming or recent home tech product that I'm remotely excited about.
… Nope. In fact one of my in-laws said that they'd buy us an air fryer for Christmas once the sales came. Everyone forgot about it shortly after and I don't care one bit.
(annoying air fryer owner voice) we have a 25 litre air fryer and it’s awesome, just a nice little countertop oven, and [METEOR FALLS ON LONDON]
Most of the stuff these days is behind the scenes, like clean energy, innovative water reclamation, etc. It's life changing, but we don't really see it every day.
In my opinion we should get cars out of urban areas and switch to e-bikes/rickshaws. That would be both transformative and an improvement.
Also I think it's more of a constant stream of small incremental changes. Things like GPS, the internet, lithium batteries, etc. have all enabled a lot of other innovation, but have been rolled out more continuously.
Just looking at battery tech: smartphones, drones, and EVs all wouldn't be possible without lithium batteries, and each of those has gone through massive innovation cycles of its own.
I think there's definitely something to be said for the exhaustion of low-hanging fruit. Most of those big consumer innovations were the application of novel physics or chemistry (refrigerants, synthetics, plastics, microwaves, etc.) combined with the automation of very labor-intensive but relatively simple tasks (dish washing, laundry, manual screwdriving, etc.). The digital age added some very powerful logic to that toolset, but it remains primarily limited to the kinds of activities and processes that can be defined algorithmically. The ingenuity of software developers, along with the introduction of new tools and peripheral capabilities (printers, networks, sensors), has shown that the set of problems that can be defined algorithmically is much larger than you'd first think, but it's still limited.
Adding on to this, it's worth noting the degree to which defining problems algorithmically requires altering the parameters of the problem itself. For example, compare shopping at a store with using a vending machine. The vending machine dramatically changes the scope of the activity: it limits the variety of items you can get, only allows one item per transaction, prevents you from examining the goods before purchasing, and so on. The high-level process is the same; I move from having no soda and some dollars to one soda and fewer dollars. But the changes made so the procedure can be mechanized come with significant social tradeoffs. Each transaction has less friction, but also less potential.

These consequences are even more pronounced if your point of comparison is an old-school soda fountain, where "hanging out waiting for the soda jerk and drinking together" is largely the whole point. That activity requires more from you, but it also gives you more opportunities to interact with and meet people, and to see friends outside of work or school. Even if you don't want to spend the time or be social (or, like me, sometimes get severe social anxiety!), this still leads to a world where there are more and larger blocks of time that you can't be expected to trade away to your job or other obligations. Your boss is likely to fire you for being late to work, unless that tardiness comes from the ferry you and your coworkers rely on being late. Because it's inevitable friction in a necessary part of working (you can't work if you can't get to work), and because it can't be put entirely on the individual (even if you do want to blame the employee for taking the "wrong" boat, do you really want to fire the whole team?), the system is basically forced to give you more grace than it otherwise would want to.
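To put the vending-machine point above in more literal terms, here's a toy sketch (my own made-up illustration, nothing more) of what the "mechanizable" version of the transaction looks like once it's been narrowed down to an interface:

```python
# Toy illustration: the vending machine is the store transaction with most of
# its parameters stripped away so that what's left can be mechanized.
# Everything here (slot names, items, prices) is invented for the example.

CATALOG = {"A1": ("cola", 2.00), "A2": ("lemon soda", 2.00)}  # fixed, tiny menu

def vending_machine(slot: str, dollars: float) -> tuple[str, float]:
    """One item per transaction, no browsing, no inspection, no conversation."""
    if slot not in CATALOG:
        raise KeyError("not stocked")           # you can't ask for anything else
    name, price = CATALOG[slot]
    if dollars < price:
        raise ValueError("insufficient funds")  # no haggling, no credit, no grace
    return name, dollars - price                # soda out, change out, done

item, change = vending_machine("A1", 5.00)
print(item, change)  # cola 3.0
```

Everything the function can't express, like browsing, chatting with the clerk, or lingering, is exactly the "potential" that gets traded away for lower friction.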
This is another way to frame the problems with more recent "innovations": while social media and the gig economy arguably empower individual consumers and producers, whether of cultural output or of services like taxis, they do so in ways that fundamentally change the relationship and individualize the connections between consumers, producers, and the system they interact through. And because nobody has as direct a connection to the owners and operators of that system, those owners and operators have more power to increase their profits at the expense of everyone who actually has to use the system to function.
There's definitely something to this narrowing-of-opportunities idea. To frame it in a really bare-bones way: people frame the world in simplistic terms and then assume their framing is the complete picture (because they're super clever, of course). When they try to address a problem with a "solution", they only address their abstraction of it, and if they're successful in the market, they actually make that abstraction the dominant form of the thing. Meanwhile, everything they disregarded is either lost, or still there and undermining their solution.
It’s like taking a 3D problem, only seeing in 2D, implementing a 2D solution and then being surprised that it doesn’t seem to do what it should, or being confused by all these unexpected effects that are coming from the 3rd dimension.
Your comment about giving more grace also reminds me of work from legal scholars who argue that algorithmically implemented law doesn't work, because the law itself is designed with a degree of interpretation and slack that rarely translates well to an "if x then y" model.
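As a toy illustration of that (entirely my own made-up example, not from the scholarship being described), here's what gets lost when a rule is encoded as a bare conditional, using the ferry example from earlier in the thread:

```python
# Hypothetical example: an "if x then y" lateness rule vs. the slack a human applies.

from dataclasses import dataclass

@dataclass
class Arrival:
    minutes_late: int
    ferry_was_late: bool  # one of an open-ended list of mitigating circumstances

def rigid_rule(a: Arrival) -> str:
    # Pure "if x then y": trivially mechanizable, zero room for interpretation.
    return "penalize" if a.minutes_late > 0 else "ok"

def human_rule(a: Arrival) -> str:
    # A human supervisor applies slack. The ferry is just one excuse; the real
    # list of acceptable excuses is open-ended, which is exactly what resists
    # being enumerated as conditions up front.
    if a.minutes_late > 0 and not a.ferry_was_late:
        return "penalize"
    return "ok"

late_because_of_ferry = Arrival(minutes_late=10, ferry_was_late=True)
print(rigid_rule(late_because_of_ferry))  # penalize
print(human_rule(late_because_of_ferry))  # ok
```

The rigid version isn't "wrong" so much as it quietly redefines the rule to only what can be enumerated in advance.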
I’ve thought about a similar idea before in the more minor context of stuff like note-taking apps – when you’re taking notes in a paper notebook, you can take notes in whatever format you want, you can add little pictures or diagrams or whatever, arranged however you want. Heck, you can write sheet music notation. When you’re taking notes in an app, you can basically just write paragraphs of text, or bullet points, and maybe add pictures in some limited predefined locations if you’re lucky.
Obviously you get some advantages in exchange for the restrictive format (you can sync/back up things to the internet! you can search through your notes! etc.), but it's by no means a strict upgrade; it's more of a tradeoff with advantages and disadvantages. I think we tend to frame technological solutions like this as though they were strict upgrades, and often we aren't so willing to look at what's being lost in the tradeoff.
In reading about this I've seen some interesting concepts from scraping the edges of management cybernetics, which treats organizations kind of like analogue information-processing systems. The one that really stuck in my mind is the accountability sink: an organizational function that takes the responsibility for some action or decision away from the people in the organization who actually do it and places it somewhere more abstract, like a process or a policy. This ties in to a lot of what we talk about here, since a lot of the tech industry these days seems to be about centralizing things around a few major platforms and giving the people who run those platforms as many accountability sinks as they can come up with, with AI being the newest.
A lot of the tech "innovation" is actually VC "innovation" and is meant to dismantle the safety nets of the working class. Literally half of their disruption is "we'll finance you to lose money until you've ruined all competition, and then you can price gouge everyone while your 'contractors' don't get a decent salary, a retirement fund, or any kind of insurance."