Musings of a 60-Year-Old Tech Daddy
Well, I just celebrated the completion of 60 years on this planet.
That’s me on the left with the glasses. I use this picture because my brother Stephan (on the right) and I started our first company, ‘Adamation’, back in 1984. We were wild-eyed and ambitious, living in Silicon Valley, trying to make it big. Well, we didn’t become an Adobe, but we held our own, and our early journey is documented by the Computer History Museum in Silicon Valley: https://computerhistory.org/blog/meet-the-adams-brothers/
That history piece was done quite some time ago, and the story kind of leaves off with my time at Microsoft. But wait, there’s more…
This post is a reflection on roughly 48 years of coding, and of being a ‘Technologist’ and ‘Futurist’.
I started ‘coding’, if you can call it that, when I was about 12 or 13, I think. I taught myself 6502 assembly and learned ‘Commodore BASIC’. I mean, those were the heady days of the earliest ‘personal’ computers. I spent endless hours making that poor Commodore PET do things, to the point of actually burning out some of the interface chips on the motherboard, because I was playing with a joystick from an early Fairchild video game console, trying to make games on the thing. That was back in the days when you could actually de-solder a chip on the board and replace it. Now you can barely even open the case of a computer, let alone replace any single component on it.
I think back on younger me, a very shy introverted child who found his element with computers, and I just think “Man, was he lucky”.
My first formally learned language was actually Fortran, taught to me on punch cards at Rockwell International. In the mid-to-late 80s, Borland was THE place, so I learned things like Turbo Pascal, and later Turbo C, with Turbo Prolog thrown in as a brief aside. I became a veritable expert on the earliest C/C++, and then along came NeXT computers, with “Objective-C”: a whole new way of thinking, more like Smalltalk, with message passing between objects. We did a great many things, achieving local notoriety on the NeXT platform. Then along came Taligent, who purchased a license to some of our greatest work (collaborative document editing), and some of my time, to incorporate that tech into their burgeoning (never quite made it) platform. That was a switch back to programming in C++, on steroids! There was not a feature of the language they did not love: they were heavy on templates and exceptions, and everything was a class, all the way down. These days, I avoid templates and exceptions as much as possible.
I finally joined Microsoft in 1998, and it was further down the C++ rabbit hole, except this time on the early XML team, which was busy building a runtime for XML/XSLT that looked a lot like Java, garbage collector and all. Then along came C#, although early on I think it was called “Lightning”. So all that C++ knowledge kind of got thrown out the window, although garbage collection in general was a good thing: it did away with some of the most common bugs found in most software. Thus I became a C# expert, for about a decade I’d say.
During this whole time, since my earliest days on the Commodore 64, I kept developing, and re-developing, graphics libraries. Mostly 2D, but sometimes entire 3D rendering pipelines. It’s been a constant backdrop because graphics is familiar territory, which makes it an easy way to learn a new language: you already know exactly what you’re building, so only the language is new. Only a couple of those iterations have ever found their way into commercial products, but it’s been a constant in my programming journey.
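To give a flavor of what I mean, here is a minimal sketch of that kind of kernel (not code from any of my actual libraries, just the familiar shape of one): a framebuffer, an integer Bresenham line, and a dump to a PPM file.

```cpp
// A sketch of the tiny 2D kernel I keep rewriting in each new language:
// a framebuffer, a Bresenham line, and a PPM dump most image viewers can open.
// Illustrative only.
#include <cstdint>
#include <cstdio>
#include <cstdlib>
#include <vector>

struct Canvas {
    int w, h;
    std::vector<uint8_t> rgb; // 3 bytes per pixel, row-major, freed automatically

    Canvas(int w_, int h_) : w(w_), h(h_), rgb(static_cast<size_t>(w_) * h_ * 3, 0) {}

    void set(int x, int y, uint8_t r, uint8_t g, uint8_t b) {
        if (x < 0 || y < 0 || x >= w || y >= h) return; // clip to bounds
        size_t i = (static_cast<size_t>(y) * w + x) * 3;
        rgb[i] = r; rgb[i + 1] = g; rgb[i + 2] = b;
    }

    // Classic integer Bresenham: no floating point, one running error term.
    void line(int x0, int y0, int x1, int y1) {
        int dx = std::abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
        int dy = -std::abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
        int err = dx + dy;
        for (;;) {
            set(x0, y0, 255, 255, 255);
            if (x0 == x1 && y0 == y1) break;
            int e2 = 2 * err;
            if (e2 >= dy) { err += dy; x0 += sx; }
            if (e2 <= dx) { err += dx; y0 += sy; }
        }
    }

    void save_ppm(const char* path) const {
        FILE* f = std::fopen(path, "wb");
        if (!f) return;
        std::fprintf(f, "P6\n%d %d\n255\n", w, h);
        std::fwrite(rgb.data(), 1, rgb.size(), f);
        std::fclose(f);
    }
};

int main() {
    Canvas c(256, 256);
    c.line(10, 10, 245, 200);
    c.save_ppm("line.ppm");
    return 0;
}
```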
After C#, I stumbled across the Lua programming language, and subsequently LuaJIT. I mean, scripting? Until that point, the closest I had come was the XSLT ‘language’, and that’s about it. I fell hard for Lua, though. So compact, so concise, so capable. With LuaJIT specifically, it’s so easy to interact with libraries written in C that it’s a no-brainer to consider using it. These days Python seems to be taking the scripting world by storm, particularly for AI, but I don’t actually love that language much.
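For the curious: LuaJIT’s ‘ffi’ library is the magic there, since a script just declares a C signature and calls straight into the library, no glue code at all. Since I’m keeping the examples in this post to C++, here is the mirror image instead: a minimal sketch of the classic Lua C API, hosting a Lua VM from C++ (my assumptions: a stock Lua 5.x or LuaJIT install, linked the usual way).

```cpp
// A sketch of embedding a Lua VM in a C++ host via the standard Lua C API.
// Build with something like:  g++ host.cpp -llua   (library name varies).
#include <lua.hpp>  // wraps lua.h, lauxlib.h, lualib.h for C++
#include <cstdio>

// A C function exposed to Lua: takes a number, returns its square.
static int c_square(lua_State* L) {
    double x = luaL_checknumber(L, 1); // read argument 1 off the Lua stack
    lua_pushnumber(L, x * x);          // push the result back
    return 1;                          // number of return values
}

int main() {
    lua_State* L = luaL_newstate();    // a fresh Lua VM
    luaL_openlibs(L);                  // standard libraries (print, math, ...)

    lua_pushcfunction(L, c_square);    // register our C function...
    lua_setglobal(L, "square");        // ...as the Lua global 'square'

    // Run a snippet of Lua that calls back into the C side.
    if (luaL_dostring(L, "print('square(7) =', square(7))") != 0) {
        std::fprintf(stderr, "Lua error: %s\n", lua_tostring(L, -1));
    }

    lua_close(L);
    return 0;
}
```

The whole seam between the two worlds is a handful of stack operations, which is a big part of the charm.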
Today, I’m deep into C++ again, preferring it to the garbage-collected stuff. I feel that I’ve finally learned how to manage memory properly, so the performance gains and the control are worth the risk.
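By “properly”, I mostly mean leaning on RAII so that ownership is spelled out in the types and cleanup is deterministic, rather than sprinkling raw new/delete around and hoping. A minimal sketch of the style (illustrative only, not from a real codebase):

```cpp
// Ownership stated in the types (RAII): deterministic cleanup,
// no garbage collector, no naked new/delete.
#include <cstdio>
#include <memory>
#include <vector>

struct Image {
    int width = 0, height = 0;
    std::vector<unsigned char> pixels; // vector owns (and frees) its buffer

    Image(int w, int h) : width(w), height(h), pixels(static_cast<size_t>(w) * h) {}
};

int main() {
    // unique_ptr makes ownership explicit: exactly one owner, freed on scope exit.
    auto img = std::make_unique<Image>(640, 480);
    std::printf("%d x %d, %zu bytes\n", img->width, img->height, img->pixels.size());
    return 0;  // 'img' and its pixel buffer are released right here, predictably
}
```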
And now we have “AI”.