istewart
It’s remarkable to me how far and how rapidly this guy swerved outside of his initial lane, all while having absolutely terrible voice and diction for a long-form interviewer. He’s worked on that, but it’s clear that his initial success was based on targeting high-level professionals who otherwise wouldn’t often be sought out for the type of interviews Lex does. I’m thinking of guys like Jim Keller and Chris Lattner, who would probably only make such public appearances in the form of keynotes at conferences for their specific niches.
But you can’t convince me that you’re really the world’s best technical interviewer if you’re also uncritically sitting down with Donald fucking Trump, or deciding that you’re suddenly enough of a historian to take on Gibbon with your fucking podcast. Who’s financing this guy, anyway? Is MIT actually kicking him cash, or is it just an RMS scenario where they give him space because they’re concerned about where he might end up otherwise?
Despite the industry’s deeply ingrained neophilia, I think it speaks to the importance of backwards compatibility and legacy systems.
I can’t help but think that the genAI craze will end up being a regrettable side-quest along the path to “coding for non-programmers,” akin to Visual Basic. But hey, I bet there are a lot more legacy VB apps being kept alive out there than anyone would be comfortable with.
I sharply disagree, but this is a subtlety that’s lost on a lot of people. The tech industry’s success since at least the 1990s, up until the mid-2010s, was about making technology easier for the individual user, a more accessible and (potentially) more efficient means for accomplishing many routine interactions. Tech devices existed as tools in service of the will of the end user, and if you were really willing to drink the kool-aid, extensions of the user themselves, Jobs’ “bicycle for the mind.”
The expectations being cultivated for AI now set it up as an entirely separate entity from the end user, and one that is potentially more capable at some point in the ill-defined future. This opens the door to resources being reallocated toward this nebulously powerful entity, and the allocation of shared resources is at the very core of politics. This is a hard pivot away from how technology was designed before! You and I know it’s a load of complete hogwash, but that doesn’t prevent the potential bamboozlement of the lagging generation of policy-makers. Even someone as relatively young as Kamala Harris or her likely successor Gavin Newsom could be roped into this bullshit, if only because they know where their biggest donation checks come from.
The future in which the current crop of AI retailers enjoys a successful political program is no longer one where a rising tide lifts all boats. But, for the time being, it can still be pitched as such thanks to deeply embedded cultural expectations.