2025-07-04
I am not a user of AI assistant tools when I code. This is partly because I’m a “laggard” when it comes to using anything AI, but also because of my particular journey into writing software.
I first became interested in writing code to make video games while in high school circa 2010, at a time when, of course, AI tools were a long way off. I then worked for about 7 years in the technology field, from 2015 to 2022.
2020 was when general-purpose AI, and especially generative AI, started to go mainstream. I remember being surprised by this blog post: https://maraoz.com/openai-gpt3/. Looking back it’s still impressive, though indicative of some of the issues with AI, namely that much of the article is made up. Some of it appears true, like the summary of OpenAI in the first paragraph, but all the stuff about posting on bitcointalk.org is totally fabricated.
Anyways, just as I started to leave the technology field in 2022, generative AI was going super mainstream, and AI coding assistants started popping up everywhere. Simultaneously, I was transitioning from professional coder to hobbyist coder - most of the technology work I’ve done since 2022 has been personal projects like this website. So I basically had no pressure to “perform” - I sought out tools that would make the process of coding more enjoyable, rather than tools that would improve the output.
And here’s the thing. When I was working full time as a software engineer for a startup, there was a huge pressure to produce output. So much so that I developed carpal tunnel syndrome in my wrists from typing hundreds if not thousands of lines of code a day (though in retrospect there were some easy ergonomic improvements that took me years to implement). So in that role I’d definitely have attempted to use an AI assistant to write those thousands of lines of code. Hell, I even tried implementing a natural language coding interface - using my voice to code (it didn’t work).
But yeah, anyways, I haven’t felt the need to bring AI into my development workflow. I suspect that over time, as new generations are brought up on code assistants, people may forget how code was written before them.
So here’s how I’ve always done it.
Ok so say you want to make something. Maybe an app, maybe a website, maybe a game.
Much in the same way that Carl Sagan said, “If you wish to make an apple pie from scratch, you must first invent the universe,” if you want to make any application, you’re not going to start by building your own semiconductors from sand.
Given we’re not going to start from scratch, there are basically 2 methods (which converge anyways): fork something that already exists, or build on top of a framework.
The starting point you choose will profoundly affect the route you take and what you end up with. So choosing the right tool - framing the problem - is very important. If you’re a beginner and don’t know what to choose, ask somebody who knows more than you, or pick something and see how it goes, then pick something else if you’re not progressing as you expect. Just don’t give up as soon as you get stuck - your role as a software developer is to get to that sticking point and figure out a path forward. If you give up and switch frameworks/languages/templates, you might be missing the underlying context, and keep getting stuck at that point until you take time to understand the fundamental problem.
An example of forking something existing is using web developer tools to view source on a website, and copying its HTML and CSS. Hopefully you’ll have your own unique content, otherwise you’re just plagiarizing, but realistically you don’t need to innovate on the technology of most projects. Want to make a blog? Look at how a blog you follow is published.
An example of building off of a framework is opening the Unity documentation (or “Manual”) and following their instructions to create your own first project.
I say these converge because by the end of the tutorial, you’ll have a fork of something existing, namely their tutorial application.
Once you have a project that you’ve copied or built from a tutorial, then you tinker around with it.
If you get an error you don’t understand, copy the error message verbatim into a web browser and you’ll probably end up with a Stack Overflow question and answer related to your issue.
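To make that concrete, here’s a minimal sketch in Python (the bug, the dict, and the variable names are all hypothetical): the exact text of the exception message is what you paste into the search engine.

```python
# A common beginner mistake (hypothetical example): treating a dict like an object.
config = {"name": "my-blog"}

try:
    print(config.name)  # dicts use config["name"], not attribute access
except AttributeError as exc:
    error_message = str(exc)

# This is the text to copy verbatim into a search engine;
# it will almost certainly match an existing question and answer.
print(error_message)
```

Searching the message with your own file paths or variable values stripped out usually lands even closer to an existing answer.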
Once you start wrapping your head around the framework you’re looking at, you can start to deconstruct it. If it’s open source, you can even read the source of the framework itself, and perhaps modify it. Or create your own framework at that level of abstraction.
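If the framework is written in Python, you don’t even need to hunt down its repository - the interpreter can show you the source directly. A small sketch using the standard library’s `inspect` module (reading `json.dumps` here, purely as an example):

```python
import inspect
import json

# inspect.getsource works on any pure-Python function, class, or module,
# which makes it handy for deconstructing a framework from the inside.
source = inspect.getsource(json.dumps)

# Print just the signature line rather than the whole implementation.
print(source.splitlines()[0])
```

The same trick works on most open-source Python frameworks, though it fails for functions implemented in C.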
I can’t seem to find a nice graphic for this, so here’s a list representation of a hierarchy of computing:
Here’s an example of each, from the reference point of looking at a web browser on a Mac:
As a computer user, you mostly deal with the first 3 levels: opening an application, changing its state, installing and uninstalling applications. As for myself, I’ve made a career out of the first 5 levels, having made a couple of programming libraries, mostly for internal use in corporate applications. If you want to, you can go deeper: develop your own programming language, your own operating system, perhaps even your own custom computer architecture and CPU.
But many more developers focus on the application layer than on the application framework, more focus on the libraries than on the programming language, et cetera. There’s no need to invent 10 programming languages to make 10 applications that run on 10 devices; there are probably a million applications running on a billion devices that all use a single programming language, Java (on Android). Though if you are doing something truly novel, you may need to invent something new at some level of the stack.
I’m reminded of a quote apocryphally attributed to Zed Shaw: “I’ve written a lot of software. Most of it is dead software in dead companies.” But then again, he did write a whole web server (Mongrel), which sits below the application layer and probably supported thousands of web applications in its day.
Looking back, it feels like the output was somewhat throwaway. All software, especially application software, has an expiration date. The libraries it depends on go out of date, then the programming language changes, and then the operating system, all the way down to the hardware. The only way to keep it working is to rewrite and reinvent the software every few years. Only rarely do software projects stick around for decades - applications like MediaWiki and operating systems like Debian.
Developing software is, at its core, a process of writing. Like English writing. So investing in your writing skills will surely pay dividends if you choose the path of a software creator.
And you can improve your knowledge and writing by reading high quality writings about software:
You can even go to the library and get a physical book, or buy a physical book (or pdf) online.
That was my cheat code to knowing the framework (Django) I worked with as well as anybody: buying the Two Scoops of Django pdf. Cost me about $15 and a few hours of time outside of work.
So yeah, you can rely on human intelligence, not just artificial intelligence :)