1 point

I predict that, within the year, AI will be doing 100% of the development work that isn’t total and utter bullshit pain-in-the-ass complexity, layered on obfuscations, composed of needlessly complex bullshit.

That’s right, within a year, AI will be doing .001% of programming tasks.

1 point

Can we just get it to attend meetings for us?

1 point

Engineering is about trust. In other, generally more formalized engineering disciplines, the actual job of an engineer is to provide confidence that something works. Software engineering may employ fewer people because the tools are better and make people much more productive, but until everyone else trusts the computer more, the job will exist.

If the world trusts AI over engineers then the fact that you don’t have a job will be moot.

-1 points

People don’t have anywhere near enough knowledge of how things work to make their choices based on trust. People aren’t getting on the subway because they trust the engineers did a good job; they’re doing it because it’s what they can afford and they need to get to work.

Similarly, people aren’t using Reddit or Adobe or choosing their car’s firmware based on trust. People choose what is affordable and convenient.

1 point

In civil engineering, public works are certified by an engineer; it’s literally them saying “if this fails, I am at fault.” The public is trusting the engineer to say it’s safe.

1 point

Yeah, people may not know that the subway is safe because of engineering practices, but if there were a major malfunction every other day, potentially involving injuries or loss of life, they would know, and I’m sure they would think twice about using it.

1 point

ChatGPT is hilariously incompetent… but on a serious note, I still firmly reject tools like Copilot outside demos and the like, because they drastically reduce code quality for short-term acceleration. That’s a terrible trade-off in terms of cost.

1 point

I enjoy using Copilot, but it is not made to think for you. It’s a better autocomplete, but don’t ever let it do more than a line at once.

0 points

Biggest problem with it is that it lies with the exact same confidence it tells the truth. Or, put another way, it’s confidently incorrect as often as it is confidently correct - and there’s no way to tell the difference unless you already know the answer.

0 points

it’s kinda hilarious to me because one of the FIRST things AI researchers did was get models to identify things and output answers together with a confidence score for each potential ID, and now we’ve somehow regressed from that point
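
Roughly, those classifiers score every candidate label and a softmax turns the scores into a per-label confidence. A minimal toy sketch (the labels and numbers are invented, not any real model’s output):

```python
import numpy as np

def softmax(scores):
    # Turn raw scores into a probability distribution that sums to 1.
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()

# Hypothetical logits from an image classifier for three candidate labels
# (purely illustrative numbers).
labels = ["cat", "dog", "fox"]
logits = np.array([3.1, 0.4, -1.2])

for label, confidence in zip(labels, softmax(logits)):
    print(f"{label}: {confidence:.1%}")
# The model doesn't just answer "cat"; it reports how sure it is about each option.
```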


Did we really regress from that?

I mean, giving a confidence score for recognizing a certain object in a picture is relatively straightforward.

But LLMs put words together based on how likely they are to follow one another given your input (terribly oversimplified). The confidence behind that has no direct relation to how likely the statements are to be true. I remember an example where someone made ChatGPT say that 2 + 2 equals 5 because his wife said so. ChatGPT was confident that something is right when the wife says it, simply because it thinks those words belong together.
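
A toy sketch of that point (the logits are made up, not output from any real model): the softmax over next-token scores only says how well a token fits the preceding text, so a misleading context can make the “wrong” token the confident pick.

```python
import numpy as np

def softmax(logits):
    exp = np.exp(logits - np.max(logits))
    return exp / exp.sum()

candidates = ["4", "5"]

# Hypothetical next-token scores after the prompt "2 + 2 =":
# "4" dominates because that's what usually follows in text.
print(dict(zip(candidates, softmax(np.array([9.0, 2.0])).round(2))))

# Hypothetical scores after "My wife insists that 2 + 2 = 5, so 2 + 2 =":
# the surrounding text now makes "5" the more likely continuation.
print(dict(zip(candidates, softmax(np.array([4.0, 6.0])).round(2))))

# Both distributions look "confident", but the probability only measures how
# plausibly the token continues the text, not whether the statement is true.
```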

0 points

they drastically reduce … quality for short term acceleration

Western society is built on this principle

0 points

Tell me about it…

I left my more mature company for a startup.

I feel like Tyler Durden sometimes.

0 points

How you liking it? How many years have you aged in the months working at your startup?

0 points

I’m still convinced that GitHub Copilot is actively violating copyleft licenses. If not in word, then in spirit.

0 points

On a more serious note, ChatGPT, ironically, does suck at webdev frontend. The one task that pretty much everyone agrees could be done by a monkey (given enough time) is the one it doesn’t understand at all.

0 points

I don’t think it’s very useful at generating good code or answering anything about most libraries, but I’ve found it to be helpful answering specific JS/TS questions.

The MDN version is pretty great too. I’ve never written a Firefox extension before, and MDN Plus was surprisingly helpful at explaining the limitations on mobile. The only downside is that it’s limited to 5 free prompts per day.

0 points

ChatGPT is also great if you have problems with Linux. It’s my number one troubleshooting tool.

0 points

The only thing ChatGPT etc. is useful for, in any language, is getting ideas on how to solve a problem in an area you don’t know anything about.

ChatGPT, how can I do xy in C++?
You can use the library ab, like …

That’s when I usually search for the library and check the docs to see whether it’s actually possible to do it that way. And often, it’s not.


Programmer Humor

!programmerhumor@lemmy.ml


Post funny things about programming here! (Or just rant about your favourite programming language.)

Rules:

  • Posts must be relevant to programming, programmers, or computer science.
  • No NSFW content.
  • Jokes must be in good taste. No hate speech, bigotry, etc.
