I Spent a Year Writing Code with ChatGPT and Learned Nothing
In a world where we are judged by our results and not by what we learned along the way, AI-assisted coding seems like the perfect shortcut. But even as we deliver our products faster, are we sure we're not losing something in the process? Something key to our own personal development?
Testing in Production
To be clear: I am not a developer. I'm just a guy who made it two and a half years into a Computer Science degree at UT Austin before red-black trees escorted me to the door and slammed it in my face. I retained enough to be dangerous at my summer job, but I was never going to survive algorithms, encryption, or graduate-level data structures. I was an enthusiastic tourist in CS, and eventually the tour had to end so I could study Shakespeare.
What I discovered in that window, though, was that I could build things that somehow worked. I wrote Java apps to get analytics for UT Austin's mail servers. I built an app to automate DACS cross-connections for remote learning across UT campuses. When that app needed an interface, I found my niche in programmatic back-ends and LAMP-based front-ends. And I've lived there comfortably ever since.
For nearly two decades at Uplogix and then Lantronix, I built and maintained a webapp called Central—a custom CRM that became the connective tissue of my team's daily work. When I left for my next job, I couldn't take Central with me, so I did the only logical thing: I built it again. Cleaner this time. Modern PHP practices, PDO prepared statements, and a proper layer of security. All the niceties I'd never had time to backport into Central over the years.
When I started building Atlas, I found the underlying tech had evolved, but my development cycle hadn't. I still coded the way I always had: brute force, testing in production, and making it work before making it pretty. In a few words, my process was slow as hell. What I needed, or thought I needed, was something to speed everything up.
A Scalpel, Not a Bulldozer
There's a way to use AI coding assistants that integrates directly with your IDE (or Sublime Text, in my case) where the AI can spin up multiple files, generate entire class structures, and scaffold an application from scratch. I tried it once. I watched it produce a small city of code I didn't write, couldn't understand, and definitely couldn't maintain. Hard no, Super Chief.
What I actually wanted was something more surgical. A block of code here. A function there. Something to hand me the answer without rewriting the entire page.
Even though Atlas is an internal tool used by a small team, I still wanted efficient and secure habits when it came to MySQL. ChatGPT was excellent at this. PDO prepared statements, upserts, complex JOINs. I described what I wanted, and it handed me working SQL ninety-nine percent of the time.
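To give a sense of the kind of thing I mean, here's a minimal sketch of a PDO prepared statement doing an upsert. This is not Atlas code; the table and column names (contacts, email, name) are invented, and I'm using an in-memory SQLite database so the snippet stands on its own, where Atlas talks to MySQL (the ON CONFLICT syntax shown is SQLite's; MySQL spells it ON DUPLICATE KEY UPDATE).

```php
<?php
// In-memory SQLite database so this sketch is self-contained.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE contacts (email TEXT PRIMARY KEY, name TEXT)');

// Prepared statement: placeholders keep user input out of the SQL string,
// so the driver handles quoting and injection can't sneak in.
$stmt = $pdo->prepare(
    'INSERT INTO contacts (email, name) VALUES (:email, :name)
     ON CONFLICT(email) DO UPDATE SET name = excluded.name'
);

// Insert a row, then upsert the same key with a new value.
$stmt->execute([':email' => 'kim@example.com', ':name' => 'Kim']);
$stmt->execute([':email' => 'kim@example.com', ':name' => 'Kimberly']);

$name = $pdo->query('SELECT name FROM contacts')->fetchColumn();
echo $name; // Kimberly
```

The appeal is obvious: describe that shape in plain English and the model hands back something very close to this, placeholders and all, in seconds.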
It could parse JSON better than I ever could. It understood API structures intuitively. It helped me build Jira automation, metrics calculations, and intelligent alerting in a short amount of time. The results were real. The paradigm-shifting synergy was palpable.
But somewhere in the middle of all that momentum, a small, uncomfortable thought started forming. The code worked. But did I understand it? I told myself not to worry, because the alternative was slowing down, and my team needed tools now.
The Friction is the Point
There's a specific anxiety that comes with gaps in my own knowledge, and I always feel like someone is going to jump out of the shadows to quiz me on how to implement Runnable. It reminds me of the last time I updated my resume and tried to list the things I was actually proficient in. PHP, yes. MySQL, sure. LAMP stack, you're damn right. Those answers came easy because I'd earned them through years of struggle.
After a year of ChatGPT-assisted development, I asked myself the same question. Do I understand PDO any better than I did twelve months ago? JSON? APIs? The answer, if I'm being honest, is not really. What I know how to do is describe a problem in plain language and receive a solution. That's a skill, I suppose. Whether it belongs on a resume next to "proficient in MySQL" is a question I'm glad I don't have to answer right now.
The loss of friction is the worst part. I'm not looking things up. I'm not doing trial and error. I'm not building the muscle memory that makes knowledge stick. And before anyone frames this as a coding-specific problem, it absolutely isn't. It's the same reason most writers abhor the idea of AI-generated prose. The first draft struggle, the revision, the wrong word replaced by the right one at 11pm... that's where the joy of writing lives.
AI doesn't write my books because writing is the part I love. And I'm starting to suspect that coding, even when it's slow, frustrating, and occasionally brain-melting, is the part I actually love at work.
I'm super proud of Atlas. It does everything I designed it to do and more. But the victory feels hollow, and I keep coming back to a question I can't shake: what does the steady removal of productive struggle do to us over time? And if the answer is something we should be worried about, why are we all running towards it so fast?
What's With the Stampede?
I'm not sure what happened to us, but somewhere along the way, we started trusting tech companies. I find that deeply embarrassing. How did they win us over so completely? Was it all the free stuff? Or was it the way they made us feel like we were getting free stuff while quietly turning us into the product?
When I look around and see every organization on the planet scrambling to not be the one left behind in the AI arms race, my instinct is sudden and absolute suspicion. Companies are pushing AI on their employees. They're making AI Proficiency a line item on job descriptions. We are hurtling toward full integration at a speed that suggests someone, somewhere, is very motivated for this to happen fast.
And I keep asking the same question: who is creating this worry?
If I were an AI company salesperson, my pitch would be simple: your competitors are already using this. You're going to be left behind. Then I'd smile and ask how many thousands of tokens they'd like to purchase. It's not a complicated sell. It's the same anxiety-driven stampede that gave us NFTs, except this time the technology actually does something, which makes it harder to dismiss out of hand.
And yet... I use AI every day.
That's the Gen X dilemma in a nutshell. I genuinely enjoy coding with Claude and ChatGPT. I find it useful in the small surgical ways I've described. But the moment someone tells me I have to use it, something in me pulls back hard. I don't want The Man's efficiency tools, even when The Man's efficiency tools are pretty good sometimes.
A Short Leash
I'm not giving up AI-assisted coding. That ship has sailed, and honestly, it's freeing me up to be a present manager for my team. That said, I'm keeping it on a short leash, and here's why: a tool that builds things you can't maintain isn't worth using.
Someone on my team mentioned offhand recently that I was "vibe coding" Atlas. I took quiet offense to that. Vibe coding—the practice of describing what you want and letting AI generate it wholesale while you destroy a can of Pringles—sounds a lot like cheating to me.
I know exactly what Atlas does. I know exactly what I want it to do. AI fills in the blanks. It handles the syntax I'd otherwise have to look up, the prepared statement I'd otherwise have to iterate seventeen times in production. But it doesn't write the story. It doesn't decide why the villain has a change of heart in the third act. That part is mine.
That's the only version of AI assistance I'm interested in: surgical, bounded, and always in service of a vision I already have. AI should augment the skills you bring to the table. It should help you get to your vision faster. The moment you're offloading the creation of that vision to a language model, you've stopped being the author of your own work.
At that point, we are no longer the music makers. We are no longer the dreamers of dreams.
Coming Up
AI has inserted itself firmly into the zeitgeist, and I've apparently got opinions about it, so expect to see more posts on the subject. Next up: AI outside the tech industry, and how everyday people, including my mom, are or aren't weaving it into their daily lives. Stay tuned.