I feel like narrow AI tools duped me for a while, but the more I've actually used ChatGPT professionally, the more I've come to see it as professional mimicking software. It works to put out responses that sound as convincing as possible, which has nothing to do with putting out responses that are actually accurate. These are terrible tools for anything beyond asking basic questions, generating ideas, and summarizing existing information you feed into them. At this point I use it to help me make lists and phrase emails and company messages better, and nothing that requires any actual fact-finding.
Professional bullshit artists, in the sense of the technical definition given by Harry Frankfurt in his influential book *On Bullshit*:
Frankfurt determines that bullshit is speech intended to persuade without regard for truth. The liar cares about the truth and attempts to hide it; the bullshitter doesn’t care whether what they say is true or false.
It’s a good troubleshooting tool, though. Pasting in weird error messages that don’t turn up any useful search results works surprisingly well: even if the response is partially inaccurate, it usually gives me a bit more information than a search engine would, which gives me the context to narrow my search terms and find an actual fix.
It’s especially useful for learning Nix, since the online documentation is a bit shit and ChatGPT seems to have enough of a grasp of the Nix language and how NixOS configuration works to tell me what I’m doing wrong.
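For what it’s worth, the mistakes it catches for me are usually small type/syntax things like this (a made-up `configuration.nix` fragment, not from my actual config):

```nix
# configuration.nix (fragment)
{ config, pkgs, ... }:

{
  # What I had written: a bare package where the option expects a list,
  # which throws a confusing "value is a set while a list was expected" error.
  # environment.systemPackages = pkgs.git;

  # What it pointed out: environment.systemPackages takes a list of packages.
  environment.systemPackages = [ pkgs.git ];
}
```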