
Career

Early Days

I've been messing with computers since 1982, when my dad gave me a TeleVideo TS802. It ran the CP/M operating system. In those days, there were very few software programs available. IIRC, the suite of tools consisted of SuperCalc, one of the original spreadsheets, and WordStar, one of the first word processors.

I took that computer and an HP dot-matrix printer to Boston University with me, and thanks to Mr. Suzch, I was an agile touch-typist by 11th grade. That's a skill I've never regretted acquiring. Over the next few years, I typed a lot of papers for my friends. It wasn't until two years later that my friend Kristine brought an Apple Macintosh and its nifty printer.

I wrote a lot of papers on that computer, but having had the dubious fortune of a freshman-year roommate who was tied into the weed scene on campus, I found myself in a position to get my hands on enough weed to sell some and smoke the rest for free. Necessity being the mother of invention, it wasn't long before I had worked out a nice spreadsheet that allowed me to calculate how much I had to sell to pay for the rest. Which I smoked.

At least one friend credits me with having helped get him through school, because I would rewrite his papers as I typed them up for him.

IBM S36

The first or second actual job I had as a computer guy was working for my father, who had a real estate management company of several thousand units, which he managed with a custom program written in JCL (Job Control Language), a positional-syntax language. In order for his employees to access the mainframe, we set up IBM PCs with 3270 emulator cards, connected to the mainframe over some network protocol I never knew the name of. Networking would come later, in the '90s.

The spreadsheet on those PCs was probably also SuperCalc -- it would be at least seven years before MS released Windows and Excel. I remember teaching the CEO of my dad's company how to use a spreadsheet. "Let's sum up some numbers."

I put the cursor into a cell and typed a number and hit return. Repeated that with the next cell down, then told him to add a few more numbers to the column and then use the sum function to add them up. He entered the last number and reached for his calculator.

So, I showed him. "Use a function to add up the numbers. Then, when one changes (I changed one), the sum updates automatically." Concepts like "summing a column in a spreadsheet" and "paragraph indents" were brand new to everybody.

That was the beginning of a cultural shift in the human psyche. No longer did we have to, say, add a hard return and tab over to indent a paragraph. We learned to use these new tools, abstracting away the work in many cases, until we were just exchanging symbols for formatting and actions. It was the beginning of the same cultural shift that we're seeing with AI.

San Francisco

In April of 1988, I moved to LA, but moved up to San Francisco a few months later. I put my typing skills to use doing temp work at a bunch of random places, including UCLA (they had a dedicated word-processing machine...a bit of a beast, but it produced formatted output), Halliburton in SF, and ultimately The Trust for Public Land, a non-profit land conservation organization. I temped in the Legal Department for a few months, and then they made me an offer. I was a Legal Secretary, I loved my boss, and I got to teach everybody how to use tab stops, automatic paragraph numbering -- particularly useful for legal docs -- and whatever other geekery emerged.

In 1991, I left that job and did some odd computer-related jobs, like cleaning out dusty old computers and running disk-repair utilities and diags.

I also began doing some project work, like tech editing the first edition of Larry Aaronson's The HTML Manual of Style, as well as several large-format, coffee-table books of lists of interesting web sites. Remember that this was 1995 and there was no Google. There were search engines, but this was a curated list of sites, bound in paperback. At least one of those books came with an accompanying CD.

My job as tech editor was to review the book for technical accuracy, which included validating the site links. In those pre-HTTP/1.0 days, the HTTP/0.9 spec did not include status codes. A request was a simple GET that looked like this:

GET /docs/index.html

Much like a MIME header, the GET line is plain text that gets sent verbatim to the server, followed by a return. There's more to it, but the point is that people moved those pages around a lot, so roughly 40% of the links were broken by the time I got the MS Word file containing the content.
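
For the curious, here's a minimal sketch of what issuing that kind of request from Perl looks like; IO::Socket::INET and the example host are modern stand-ins rather than what I was actually using at the time, and few servers will still answer a bare 0.9 request:

#!/usr/bin/perl
use strict;
use warnings;
use IO::Socket::INET;

my $host = 'www.example.com';    # placeholder host
my $path = '/docs/index.html';

my $sock = IO::Socket::INET->new(
    PeerAddr => $host,
    PeerPort => 80,
    Proto    => 'tcp',
) or die "connect to $host failed: $!";

# An HTTP/0.9 "simple request" is just the GET line -- no headers, no
# version token -- and the response is the raw document with no status
# line, so the only way to judge success is to look at what comes back.
print $sock "GET $path\r\n";

my $body = do { local $/; <$sock> };    # slurp whatever the server sends
close $sock;
print $body;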

So the process for checking links was to paste each one into the Mozilla browser and, if it worked, verify that it was the page referenced in the book; if not, I would start popping off directories to see if I got a valid page. It was not uncommon in those days to get directory listings from web servers, which made my search easier.

The problem was that there were dozens of entries, and checking them quickly became a time-intensive chore: the max speed available for home internet was a 28.8K modem connection, so some sites took a long time to load.

Enter Automation

So, there I was, sitting in the bungalow that I shared with my buddy Larry, looking at this Word doc and wondering how I was going to get through the whole thing. I had recently started playing with Perl. Previously, I had played with BASIC and had done quite a bit of Pascal coding (using Borland's Turbo Pascal), but Perl opened up a whole new world of possibilities, especially for text munging, so I set out to write a script that would do the work for me.

I manually updated the document, applying styles to each logical section of each web site article. That gave me the hooks I needed to write a macro that would search for a particular style, which in that editor would select the entire block of text that had that style applied. Then I added XML-like tags before and after the string and repeated that for each section type (title, name, url, description).

Once I had a tagged file containing info on all of the pages I had to check, I wrote a parser that pulled out the URL and the name, then iterated over all of the URLs, making a request for each one and saving the page and all the resources it used to a local dir. Since the entire point of this was to get Perl to do the work without me, and downloading a single page could be painfully slow, I had it run unattended.

The script worked through the list, making a request for each URL and, if that didn't work, popping off the file name and making another request. It would work back through most of the path before giving up on that link.

At the end of it, I had it generate an HTML table with the name of the link, the success status of the request, a link to the original version on the net, and one to the local copy.
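
The original script is long gone, but it was shaped roughly like the sketch below. The <name>/<url> tag names, the use of LWP::Simple, and the report layout are reconstructions rather than the real thing, and it leaves out the part that saved each page and its resources locally:

#!/usr/bin/perl
use strict;
use warnings;
use LWP::Simple qw(get);

my $tagged = shift @ARGV or die "usage: $0 tagged-file.txt > report.html\n";
open my $in, '<', $tagged or die "can't open $tagged: $!";
my $text = do { local $/; <$in> };
close $in;

# The tagged export looked something like:
#   <name>Some Site</name> ... <url>http://host/path/page.html</url>
my @entries;
while ($text =~ m{<name>(.*?)</name>.*?<url>(.*?)</url>}gs) {
    push @entries, { name => $1, url => $2 };
}

print qq{<table border="1">\n<tr><th>Site</th><th>Status</th><th>Working URL</th></tr>\n};

for my $entry (@entries) {
    my ($name, $url) = @{$entry}{qw(name url)};
    my ($status, $found) = ('broken', '');

    # Try the URL as given; on failure, pop one trailing path segment
    # at a time, the same way I had been doing by hand in the browser.
    my @tries = ($url);
    my $u = $url;
    while ($u =~ s{^(https?://[^/]+/.*?)[^/]+/?$}{$1}) {
        push @tries, $u;
    }

    for my $try (@tries) {
        if (defined get($try)) {
            $status = ($try eq $url) ? 'ok' : 'moved';
            $found  = $try;
            last;
        }
    }

    my $link = $found ? qq{<a href="$found">$found</a>} : '';
    print "<tr><td>$name</td><td>$status</td><td>$link</td></tr>\n";
}

print "</table>\n";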

That was the first time I wrote automation software as well as one of the first times I made money programming.

Mazama Software Labs

Around that same time, I was invited (or allowed?) to join the guys my roommate worked with, who were starting a company to build firewall software for this new thing called Linux. They were a bunch of Unix hackers who knew each other from Eskimo.net, a big ISP in Bellevue, Washington. They needed someone to produce documentation, so one day, David Bonn and Larry Stelmat came over to my apartment on Buchanan St., across from the Mint in San Francisco, and installed a pre-1.0 Slackware version of Linux onto my 286 PC, partitioning off 200 MB for Linux.

I spent the next few months poking at Linux, trying to figure out WTF was going on. The first real epiphany I had came when I finally grokked hierarchical filesystems. Ding!

The docs were done in LaTeX, a plain-text typesetting language, which lent itself to being version-controlled. This was my first collaborative endeavor, and though I had used RCS and maybe SCCS a few times for version control, the guys used CVS, so I had to learn that. Furthermore, the CVS server was behind a firewall, so I had to learn how to set up an SSH tunnel.

Everybody used Emacs. For everything. Including the docs, which presented a problem for me, since I didn't, and vi did not -- to the best of my knowledge -- have a formatter, so I wound up formatting manually...and probably causing unnecessary whitespace diffs. Eventually, I capitulated and decided to learn Emacs.

CVS was the primary VCS back in the day, though others were available. Unlike RCS and SCCS, both of which had been shipping with UNIX systems for some time, CVS could manage multiple files, while the others were single-file-at-a-time only. As with git, you could apply tags to the entire repo and even execute server-side hooks in response to tagging events. I'm not sure how long my hair was back then, but it was much shorter when I wound up managing CVS at Lehman Brothers and Barclays Bank in NYC, more than 10 years later.

CTL-g

Larry and I flew up to Bellevue to meet with the rest of the team. There are two things that stand out from that visit. First, I had been struggling with Emacs for weeks and was really frustrated by one thing: I kept getting into some kind of command mode and couldn't get back to editing. The first thing I did was ask David, "Hey, Emacs, how do you...", and before I finished he just said "Control g".

"Is that how you..."

"Cancel command mode. Yup."

That really cleared the way for me to dig into Emacs. I used Emacs for everything, including mail, until the early 2000s.

Grokking Unix

I must have been working on the ZD Press tech-editing project when we went up there, because I asked David how I would go about writing a script to track the size of the files I was downloading. That's when I was introduced to the concept of loops in the shell.

The actual shell we used was probably sh, though not long after that I started using tcsh; those were the primary options back then, though zsh was already around.

This was a crucial piece of the Unix/Linux puzzle for me.

EBerg@slip.net

Around that same time, I published my first web site. The ISP was a San Francisco outfit called Slip.net. Here's a link to a replica of the site. Lots of the links don't work, but back in those days I maintained that site and used it for research and as a web landing page.


More to come...

NYSE

Dan McCormick, future CTO of Shutterstock and Founder and CISO at Constructor.io, was working for a private trading firm, writing trading systems that traded on several exchanges. I was hired by Rosenblatt Securities, the broker through which they wanted to trade directly with the NYSE. Rosenblatt had been the first guy to allow his clients -- non-brokers -- to do electronic trading on the NYSE, and he wanted to get that glory back.