Over half of all tech industry workers view AI as overrated
Best assessment I’ve heard: Current AI is an aggressive autocomplete.
No. It’s not, and hasn’t been for at least a year. Maybe the AI you’re dealing with is, but it’s shown understanding of concepts in ways that make no sense given how it was created. Gotta go.
Over half of tech industry workers have seen the “great demo -> overhyped bullshit” cycle before.
Once we’re able to synergize the increased throughput of our knowledge capacity we’re likely to exceed shareholder expectation and increase returns company wide so employee defecation won’t be throttled by our ability to process sanity.
Sounds like we need to align on triple underscoring the double-bottom line for all stakeholders. Let’s hammer a stake in the ground here and craft a narrative that drives contingency through the process space for F24 while synthesising synergy from a cloudshaping standpoint in a parallel tranche. This journey is really all about the art of the possible after all, so lift and shift a fit-for-purpose best practice and hit the ground running on our BHAG.
It’s not overrated.
Using the “Mistral instruct” model to select your food in the canteen works like a charm.
Just provide it with the daily options, tell it to select one main dish, one side dish, and a dessert, and to explain the selection. Never lets me down. Consistently selects the healthier option that still tastes good.
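A minimal sketch of what that kind of prompt could look like. The menu, the model id, and the endpoint mentioned in the comments are all made up for illustration; the payload shape follows the common OpenAI-compatible chat format that many local model servers expose.

```python
# Hypothetical sketch: prompting a local "Mistral Instruct" model to pick lunch.
# Everything here (menu, model id, endpoint) is invented for illustration.

def build_lunch_payload(mains, sides, desserts):
    """Build a chat-completion payload asking the model to pick one of each."""
    menu = (
        "Mains: " + ", ".join(mains) + "\n"
        "Sides: " + ", ".join(sides) + "\n"
        "Desserts: " + ", ".join(desserts)
    )
    prompt = (
        "Here are today's canteen options:\n" + menu + "\n"
        "Pick one main, one side, and one dessert, preferring the healthier "
        "choice that still tastes good, and briefly explain your selection."
    )
    return {
        "model": "mistral-instruct",  # made-up id; use whatever your server hosts
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_lunch_payload(
    ["schnitzel", "grilled salmon"], ["fries", "salad"], ["cake", "fruit"]
)
# You would then POST this payload to your local server's chat-completions
# endpoint and read the model's reply out of the response.
```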
On one hand there’s the emergence of the best chat bot we’ve ever created. Neat, I guess.
On the other hand, there’s venture capital scurrying around for the next big thing to invest in, lazy journalism looking for a source of new content to write about, talentless middle management looking for something to latch on to so they can justify their existence through cost cutting, and FOMO from people who don’t understand that it’s just a fancy chat bot.
Largely because we understand that what they’re calling “AI” isn’t AI.
This is a growing pet peeve of mine. If and when actual AI becomes a thing, it’ll be a major turning point for humanity comparable to things like harnessing fire or electricity.
…and most people will be confused as fuck. “We’ve had this for years, what’s the big deal?” -_-
I also believe that will happen! We will not be prepared since many don’t understand the differences between what current models do and what an actual general AI could potentially do.
It also saddens me that many don’t know or ignore how fundamental abstract reasoning is to our understanding of how human intelligence works. And that LLMs simply aren’t intelligent in that sense (or at all, if you take a tight definition of intelligence).
I don’t get how recognizing a pattern is not AI. It recognizes patterns in data, and patterns inside of patterns, and does so at a massive scale. Humans are no different: we find patterns and make predictions about what to do next.
The decision tree my company uses to deny customer claims is not AI despite the business constantly referring to it as such.
There’s definitely a ton of “AI” that is nothing more than an If/Else statement.
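A tongue-in-cheek sketch of that kind of "AI", in the spirit of the claims decision tree mentioned above. The rules and thresholds are invented for illustration:

```python
# A deliberately trivial "AI": the decision tree is just a few if-statements.
# All rules and thresholds here are made up for illustration.

def claims_ai(amount, has_receipt, days_since_incident):
    """'AI-powered' claim decision that is really three if-statements."""
    if not has_receipt:
        return "denied"
    if days_since_incident > 90:
        return "denied"
    if amount > 10_000:
        return "escalate"
    return "approved"

print(claims_ai(250, True, 10))      # → approved
print(claims_ai(250, False, 10))     # → denied
print(claims_ai(50_000, True, 10))   # → escalate
```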
That’s basically what video game AI is, and we’re happy enough to call it that