
GPT and the whole AI bs we have at the moment excel at being convincing. They're even prepared to back up what they say.
The problem is that all of it is generated, not necessarily fact.
They will generate API methods, entire libraries, sources, legal cases, and science publications.
And they will be absolutely convincing as they present and back up those claims.

For example, GPT gives you some API function from some library that magically solves your issue. Maybe you aren't hugely familiar with the library, but you don't trust GPT - so you research this made-up API method and find the actual way to do it. Except you have GPT insisting this method exists and works the way you want it to. So you research more, dig deeper.
Eventually you end up reading the source code, gain a deeper understanding of the API in general and of how to actually find useful answers (i.e. what to put in a search query), and end up using the real method you stumbled on while hunting for the mythical perfect one.
I mean, I guess that's a win? You learned some documentation, you solved the problem... Who cares?
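To make the pattern concrete, here's a minimal Python sketch. The "hallucinated" call is invented for illustration (Python lists have no `find()` method); the working call is the real standard-library API you'd eventually land on:

```python
# The scenario: a chatbot confidently suggests a method that sounds
# plausible but does not exist, and the real API is slightly different.

items = ["red", "green", "blue"]

# What a chatbot might invent (raises AttributeError if you try it,
# because list objects have no .find() method):
#   pos = items.find("green")

# The actual way, found by reading the docs or the source:
pos = items.index("green")
print(pos)  # prints 1
```

The invented method is close enough to real APIs (`str.find()` does exist) that it survives a quick sanity check, which is exactly what makes this failure mode time-consuming.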

Maybe I’m just bitter because that was how I first tried any of the new AI things. And I wasted 2-3 hours instead of actually solving the fucking problem by consulting the facts.
