The gif above was exactly me in that moment when, after looking at a codebase for days, you finally realize it was mostly written with ChatGPT 🤦♂️
Not sure about you all, but I've already started seeing it in commercial use as well, and I think it sucks for everyone.
Just like the avalanche of AI-generated 💩 content here on LinkedIn makes the experience worse for everyone.
<rant on why it's a bad idea to use AI code generators coming up>
Call me a "traditionalist", but I have yet to see any real proof that using Copilot-like tools doesn't lead to a general dumbing-down of developers. What I do see, first-hand and anecdotally, is evidence of the real waste and additional problems that using them brings.
Unless you REALLY are a very experienced programmer who's done the exact same thing you want ChatGPT/Copilot to write for you at least 10 times before (with the code going to production and you being certain it's good), please consider refraining from using these tools.
Otherwise I believe that you, and everyone else on your team and in your company, will likely be worse off in the long run.
Not to mention that we're all lazy by nature, so when you have AI write the code for you, you naturally stop thinking about whether it makes sense or whether you're missing something more fundamental.
Either your future self, or others on your team, will waste countless hours debugging and rewriting your 💩 auto-generated snippets.
And please remember that writing code takes up less than 10% of a programmer's time.
What is easy, quick and pleasurable is usually also not great for you long term. Think eating lots of refined sugars and binge-watching series on Netflix.
If you are amongst the very few who can actually get a productivity improvement from using ChatGPT or Copilot, you'll know it.
Even then, spend some time and really think hard about whether you want to build such a habit, for your own mind and your future career in the space. After a certain level of "seniority", nobody pays you for the time you spent - or saved - writing code.
They pay you to develop architectures that work and scale, and to solve really hard problems that involve deeply hidden intricacies of the language or design patterns.
Copilot and ChatGPT can't really help you in any of these areas, and they can actually harm your future growth potential by giving you the illusion that great code can "magically" come out of your simple prompts. After a certain level nothing is simple anymore, and you won't be able to prompt any AI for an immediate solution.
And if you REALLY want to be more "productive" during that 10% of time when you write code, learn vim motions and master a keyboard-only workflow. That alone will give you at least a 2x improvement.
Using your own brain, creatively thinking through problems in pseudocode and on whiteboards/paper/schema tools, and then writing code by hand, should be the default approach for the vast majority of developers on this planet.
I'm not a purist in the sense of saying "AI bad, use fingers". Rather I'm arguing for not using it to write core pieces of logic and other key components, and not using it to compensate for your own lack of education/understanding as a developer.
Overall the main issue I see is that it's being used in a fumbling way by fumbling devs, to the point where they don't really understand why Copilot generated the code that way.
So they go along with it, and if it works they push it to production, giving up agency to the AI and secretly hoping that "it knows what it's doing".
This is a certain recipe for disaster, and I'm convinced it's happening right now across the many industries that are heavily adopting AI and where human verification is essential for correctness.
The future will be very interesting 🥺