If I see someone boasting about programming with AI, in 0.1% of cases they use it responsibly (as a tool to quickly get introduced to a topic and brainstorm ideas), and the rest of the time they’re probably a script kiddie letting ChatGPT do Advent of Code or something and calling themselves a programmer.
Same thing with all the folks who took the “copy-pasting from Stack Overflow” joke literally.
I regularly have to look for guidance online through code examples, but you need to understand what the code you’ve found actually does under the hood, for when it inevitably breaks because it wasn’t made for your specific use case.
Do people actually copy and paste code with no understanding of how it works, from SO or Copilot? I always thought this was just a joke.
I feel like there is almost no chance of a Copilot-generated program working as expected without an understanding of the code it writes. It makes some hilariously bad choices at times, and frequently drops or changes code that was added previously.
As someone who has often been asked for help or advice by other programmers, I know with 100% certainty that I went to university and worked professionally with people who did this for real.
“Hey, can you take a look at my code and help me find this bug?”
(Finding a chunk of code that has a sudden style-shift) “What is this section doing?”
“Oh that’s doing XYZ.”
“How does it work?”
“It calculates XYZ and (does whatever with the result).”
(Continuing to read and seeing that it actually doesn’t appear to do that) “Yes, but how is it calculating XYZ?”
“I’m not 100% sure. I found it in the textbook / this ‘teach yourself’ book / on the PQR website.”
Skilled in asking a chatbot how to job.
Fortunately the company execs aren’t the ones doing the hiring, and if they are then you probably don’t want that job!