The march of technology
or, my career over the past two decades
I’m Old
Squirt read my resume yesterday on the 20th anniversary of my graduation with my Bachelor’s degree. It’s a milestone that snuck up on me, but as part of my year-end activities I always update my resume with the year’s achievements and, sure enough, it’s been exactly twenty years of a technology career.
To start this year’s updates I discussed my resume with ChatGPT. My question was how to approach adding all the new things I’ve done this year because there was no room left on my two pages. I either needed to move to a third page or I needed to start chopping things out.
ChatGPT’s answer? Take everything that’s over a decade old and throw it into an “early career” bullet at the bottom because that shit is way too old to be relevant anyways. Keep the two pages but focus more on just the stuff from this decade. Also, I bet your knees crack when you get out of bed, right, meat puppet?
Indeed they do.
The March
Feeling old aside, I found myself reminiscing with Squirt about how far I’ve come. Because that first job, the first bullet on my resume, was me being a COBOL programmer on a mainframe. Most of the folks I work with now barely even know what a mainframe or COBOL is. To them there’s no real difference between programming on a mainframe terminal and programming on punch cards, assuming they even know what those are. But if you do, well, I’m not quite that old.
By my count I have seen three very different paradigms for technology foundations in the past twenty years. And I believe I’m seeing the cusp of the next one on the horizon.
Mainframe
Client/Server
Cloud/Mobile
Mainframe, baby
In the bad old days we had a mainframe. A single computer spread across some cabinets in a room as big as a living room (or larger) buried in the basement of our headquarters. In 2006 it was already outdated, but it was a solid workhorse that handled the global operations of a manufacturer. Black screen and white text on a terminal, though I thought I was fancy and set mine up with green text. Hierarchical databases, my god, the horror.
When there were issues I got a phone call on my landline and had to drive into the office to fix them. We paid people to just sit in front of banks of computer monitors overnight, watching all the software run, because overnight was the only time we could process everything. We allocated data records down to the individual byte and column. We defined jobs and files and passed them through convoluted sequences of steps that each modified them in different ways.
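If you’ve never seen a record layout, here’s a minimal sketch of the idea in Python (the field names, offsets, and sizes are invented for illustration; on the mainframe we did this with COBOL record layouts, not Python). Every field owned an exact byte range, and programs read the file by slicing those bytes:

```python
import struct

# Hypothetical fixed-width record layout, 30 bytes per record:
#   bytes  0-5   : customer id      (6 ASCII digits)
#   bytes  6-25  : customer name    (20 chars, space padded)
#   bytes 26-29  : balance in cents (4-byte big-endian integer)
RECORD_FORMAT = ">6s20si"
RECORD_SIZE = struct.calcsize(RECORD_FORMAT)  # 30 bytes

def parse_record(raw: bytes) -> dict:
    """Slice one fixed-width record into named fields."""
    cust_id, name, cents = struct.unpack(RECORD_FORMAT, raw)
    return {
        "id": cust_id.decode("ascii"),
        "name": name.decode("ascii").rstrip(),
        "balance": cents / 100,
    }

# One record, exactly as it would sit on disk or tape.
raw = b"000042" + b"ACME MANUFACTURING".ljust(20) + (123456).to_bytes(4, "big")
print(parse_record(raw))
# {'id': '000042', 'name': 'ACME MANUFACTURING', 'balance': 1234.56}
```

Shift one offset and every program that reads the file breaks, which is why those layouts were guarded so jealously.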
It was a world that orbited the big iron at the HQ. And almost immediately, because again, I’m not that old and I was working for a slow-moving manufacturer, I saw a new world come stomping in: the client/server paradigm.
Client/Server, oh my
Ah, pizza boxes. Most folks I work with know about these guys. Rack-mounted computers stacked on top of each other. Now when I got phone calls I could use my laptop and jump on the server to look at things. Computing was no longer scarce - if you needed more power it was just a matter of racking and stacking some more pizza boxes. Well, and networking them and powering them and cooling them and figuring out the storage. But at least nobody had to call IBM to turn some screws to squeeze some more juice out of the monster in the basement.
Decentralized compute became the norm, with more companies running their own datacenters or co-locating with others to pool resources - either way getting away from the monolithic monster mainframes. There was a new degree of freedom, as long as you could pay your infrastructure guys, and generally some cost savings, because you could be more bespoke about how you set up your compute. It brought a level of diversity and agility that we didn’t really have with the mainframe. No more scheduling around everything else or juggling resources in quite the same way.
The second step in my career was bridging the mainframe and the server. Or standing up software directly on the servers. And it was good. We were solving problems.
But then suddenly I went to Houston in 2014 and all anybody could talk about was this fucking “cloud.”
Cloud first, mobile first. Seriously?
I’d been working professionally for 9 years. Note that this was only 11 years ago. And suddenly client/server is shit? We’re going to take all our decentralized architecture and go back to the bad old days of the mainframe, but now it won’t be our mainframe, it will be Microsoft’s mainframe, and we’re going to name it after the color blue. The fuck?
Tangent: Microsoft wasn’t coming up with anything new here. They were just stealing wholesale from Amazon, who had been seeing some pretty amazing success with AWS for a couple of years already. Nowadays, the Amazon retail website is just another customer of the company’s real moneymaker - Amazon Web Services. The retail website has more revenue, but the cloud is where the profit is.
Nevertheless, the cloud is just somebody else’s mainframe - one you don’t have to cool down or put locked doors around, because it’s probably not even in the state where your headquarters is. But in 2014, when I heard “cloud first, mobile first,” it sounded completely ridiculous to me. Where was the value proposition? Why would we restrict ourselves to just what a cloud provider gave us?
I remember asking my boss at the time what he thought, and he struggled to explain it too. But there were many, many nerds and executives talking about it as The Thing That Was Coming. I saw some value, sure, but a tectonic shift? A fundamental change in how the entire world of technology works? Man, I dunno…
But I started paying attention to it, I started trying to understand the point behind it, I started trying to learn about it. Because while I may be possessed of a large dose of hubris and a rock-solid belief in my nerd cred, I was willing to believe I was missing something. And I was. Within a year I was seeing it everywhere.
Ten years later, the cloud-native world is the world I live in. I am the dinosaur who remembers the mainframe and terminology like “rack and stack” from the pizza box days, but my days are filled with discussions about things out there in the cloud, and “the datacenter” is almost like a vestigial organ in many of my thoughts.
And now, eleven years after that “cloud first” conference, I find myself returning from a conference that exists only because of the cloud: AWS re:Invent.
And I find myself growing tired of hearing the exact same thing over, and over, and over again. It’s all about AI.
Now arriving at the AI bubble…maybe?
I’ve been using AI since its very public arrival on the scene three years ago. It is amazing and powerful and it just gets more so every day. But I’d never had the time or the inclination to really delve into it. It was a tool and a concept, but not something I felt like I needed to spend my limited cycles on. And now that is changing.
Now I find myself reading things like this amazingly well-done explanation of LLMs and token caching. Now I’m chatting with GPT about what makes it tick. I’m actively seeking ways to turn this tooling to my advantage. And it really does feel just like the inflection point of “cloud first, mobile first.”
Technology marches forward and I’ve always felt that part of my job in my field is to see what it is marching towards - or what it is marching away from. First we marched away from the mainframe monster in the basement so we could democratize onto servers we controlled. Then we marched towards a place where we could keep that democratization but shift the complexity of infrastructure management into just another corporate utility bill, centralizing things into datacenters we do not own. And now we’re looking at a new shift of complexity - one where managing information becomes something with fewer humans involved.
Where are we putting the complexity now?
There’s a saying that I’ve heard over and over again.
Complexity is never eliminated, only moved. ~Some Nerd
Technology marches forward and it just gets more and more complex but, at the same time, the complexity is becoming more and more invisible. Fifty years ago I might have needed to drive a box of tapes across the country to move the data from mainframeA to mainframeB. Twenty years ago I could send it over the wire from datacenterA to datacenterB. Ten years ago I could have a program running in Virginia that moves data from cloudA to cloudB.
Now I can just ask a glorified calculator to pick the data up and move it somewhere else in plain English. No muss, no fuss, and I have no idea what is actually happening. That’s a pretty spooky thing for someone who has spent twenty years understanding exactly what is happening. But it’s an undeniably powerful thing.
Looking over my career, I have moved further and further away from complexity even as that complexity has ballooned behind the scenes. The bar to create, to innovate, to make something unique in the world has gotten lower and lower. From understanding how information was laid out, byte by byte, on a hard drive to just asking in plain English for what I want.
I am halfway through my career. What will my resume look like in another twenty years?
Squirt Says…
You can’t step in the same river twice ~Heraclitus of Ephesus
This may sound like nonsense, but what he means is that the river will have changed, even if only a second has passed. He said this about 2500 years ago. What does it show? The only constant is change. If it wasn’t, it would have been ridiculous for him to say such a thing back then. 2500 years ago people thought the earth was flat. 2500 years in the future we [will] know about trillions of solar systems. That just proves what I say next: change isn’t steady, it’s exponential. Each year new change happens faster. Now you’re probably thinking to yourself that this doesn’t have anything to do with computers. Except it does. The changes to computers are just going to keep growing faster.
Dad Responds…
I love that you found that Heraclitus quote on your own; it’s one of my favorites. In addition to the river changing, I would submit that the person has changed as well, because we are dynamic and ever-changing (hopefully ever-growing) people ourselves.
I also agree that change isn’t steady and that it is accelerating. You’re skirting around the edges of a couple of things, but I immediately thought of Moore’s Law when I read this. You are going to see a world that changes faster and more vastly than the one I saw, and I think I had most previous generations beat myself. Pretty exciting.
Final Note…
Charity Majors is one of my favorite technical bloggers, and as I was sending my draft link to Squirt for his response, what do I see but this land in my inbox. I am blown away that I’m on the same page as her on the same day. Neat!
AI starts feeling a lot less spooky once you dig just a tiny bit into what the models are (numbers), how they are trained, what an inference engine is, and how orchestration and glue turn all of that into actual productized AI. It all feels very grounded to me after months of looking into these areas and doing some very basic vibe-code lab work.
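To make the “models are numbers” point concrete, here’s a toy sketch in Python: a made-up one-layer “model” (nothing like a real LLM, and the weights are invented for illustration) where inference is nothing more than multiply, add, and squash.

```python
import math

# The "model" is literally just stored numbers: weights and biases.
# Training is the process that would have produced these values;
# here they are simply made up.
WEIGHTS = [[0.8, -0.3], [0.1, 0.9]]
BIASES = [0.05, -0.2]

def infer(inputs: list[float]) -> list[float]:
    """A toy 'inference engine': multiply inputs by weights, add bias, squash."""
    outputs = []
    for row, bias in zip(WEIGHTS, BIASES):
        total = sum(w * x for w, x in zip(row, inputs)) + bias
        outputs.append(1 / (1 + math.exp(-total)))  # sigmoid activation
    return outputs

print(infer([1.0, 0.5]))  # numbers in, numbers out
```

A real LLM is the same flavor of arithmetic repeated across billions of stored numbers, with tokenization on the way in and sampling on the way out; the orchestration and glue are everything wrapped around that loop.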