Friday, July 22, 2022

Internet Withdrawal (And my long, tortured history with computers)

I have been addicted to computers since puberty. Where has it gotten me? Where has it gotten us?

I have been playing with computers since about 1973, when I was 13 years old. My math teacher, Mr. Wilbur ("Greasy Pete" we called him, because of his love of Vitalis - the 60-second workout!) decided that computer programming could be learned at almost any age. Before that, only "Seniors" were deemed worthy of learning the ins and outs of writing code. Mr. Wilbur proved the skeptics wrong.

My first "machine" was a time-shared PDP-8 which had less memory and a slower processor than what's in a simple kitchen timer today. 4K of hand-wound "core" memory, consisting of little metal ferrules or "cores" with wires intersecting and wrapped around them. These were "knit" by hand by little old Italian ladies in the suburbs of Boston - one of them was a grandmother of my roommate at GMI. The "processor" was a series of logic gates - each on its own removable "flip card" hand-wired with a wire-wrapping tool. We communicated with this high-tech monster using a Western Electric teletype machine, hooked up to a phone line via a 110 Baud modem (MODulator-DEModulator) connection. If you wanted to "save" a program you were working on, you could either print it out (and re-type it later) or make a copy on "paper tape" which would spit out from the side of the teletype in a 1" wide yellow paper strip. Later on, you could run this tape through the reader (also built-in to the teletype) and "load" your program - basically re-typing the entire program using the automated tape feed.

And I was addicted.

I would spend every spare moment playing with that computer. I would type in programs I found in "Byte" magazine, or faded copies on brown printer paper (tractor feed? Don't get fancy!) handed to me by older boys. I wrote a few original programs of my own, including a "Star Wars" program that simulated - via text - the climactic ending of that movie. My other math teacher - Miss White, the fireplug-shaped lesbian - would scream at me as the rat-a-tat-tat of the teletype machine interrupted her class.

It was pretty primitive.  I was hooked.

I went away to prep school and they had a brand-spanking new PDP-11 with dual 8" floppy drives! Wow! You could write a program and store it on a disc. I learned early on, to my dismay, the importance of backing up your programs, as those old low-density floppy discs could easily un-format themselves and you'd lose all your data. It was fun and again, I spent all my free time playing on the computer. We even had a DEC VT-100 monochrome CRT monitor! No more punching keys on a teletype or wasting paper. And speaking of paper, we had a Digital Equipment LA-36 DECwriter, which used 11" wide tractor-feed "computer paper" in alternating bars of white and blue - just like the big boys had!

I went away to college and it was harder to get my computer fix. GMI had a hoary old IBM 360 mainframe on which we were "taught" to program FORTRAN on punched cards. You used a punch card machine to write your program - one card for each line of code - and then you presented it to the computer Gods who ran the almighty mainframe. Overnight (!) they would "run" your program and present you with a list of error codes it generated (if any) the next day. If you were like me, and wrote code and then ran it to debug, this was frustrating. It was like having sex over long-distance, or one of those chess games people play by mail.

I found my fix elsewhere. At the factory, they bought a CP/M machine which sort of looked like what would become the IBM PC in a few years - a CRT (monochrome) monitor, a box containing the 8080 processor and S-100 bus, and a separate keyboard. If only Gary Kildall had signed that license agreement! Bill Gates would be a third-shift IT guy at some obscure company today.

Of course, this being General Motors, someone had ordered this computer on the premise that we were moving into the future and needed "computers" even though no one really knew what for. Of course, being all of 18 years old, I turned it on and started playing with it. One of the draftsmen said, "turn that off! You'll break it! That's an expensive computer!" Translated: Stop showing me up. Computers scare me. Make it all go away!

I left GM and went to work for Carrier and bought an IBM PCjr. What a piece of junk. To save money, they eliminated the parity bit in the memory, so errors went undetected, ensuring regular crashes. Inexplicably, they also made it largely incompatible with regular IBM PC software. Sure, you could "run" some programs (like the early version of Flight Simulator) but the colors would all be "off".
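A parity bit, for those who don't remember, is one extra bit stored alongside each byte so the hardware can at least notice when a bit flips. Here is a minimal sketch of the idea in modern Python (the byte values are made up, just for illustration):

    # What a parity bit buys you, in miniature: one extra bit per byte
    # lets the memory hardware notice (not fix) a single flipped bit.
    def parity(byte):
        """Return 1 if the byte has an odd number of 1-bits, else 0."""
        return bin(byte).count("1") % 2

    stored = 0b01100001        # the byte we wrote to memory
    check = parity(stored)     # the extra check bit stored with it

    corrupted = stored ^ 0b00000100   # a stray bit flips before read-back

    # A machine with parity memory halts with a memory error here;
    # a machine without it (like the PCjr) just computes on garbage.
    print(parity(corrupted) == check)  # False -> error detected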

At work, they had a motley collection of machines, including a Tektronix graphics computer which used a storage-tube phosphor screen. This kind of screen didn't need to be "refreshed" at 60Hz or whatever. You drew on it and the image stayed for hours until you "wiped" the screen. It also had a four-color (four pen) X-Y plotter that could draw graphs. For some reason, people think these are a new invention, and I see TikTok videos online of an X-Y plotter drawing funny things and people saying, "Look! The computer can draw!" as if this "invention" that was around in 1981 was novel. Whatever it takes for those TikTok hits, I guess.

Anyway, we used the Tektronix to input data and draw graphs, and like at GM, there was a mainframe department and the acolytes there guarded the temple with fury. No one was allowed in! No one may program the sacred machines! They deigned to write a simple program for the Tektronix, but charged our department thousands of dollars. My boss was excited I could write better programs for free. And like a lot of machines in that era, it ran on BASIC, which was clunky, but easy to use. Of course, each brand of machine had its own version of BASIC, so you had to learn the nuances between them. For example, the Tektronix machine had its own graphics command set to run the X-Y plotter and draw vector graphics. Slowly.

We actually had an old PDP-8 in the heavy machinery lab, which logged data from the test equipment on giant "hard drives" the size of garbage can lids. These "disc packs" could be removed (they probably stored 1MB or so) and replaced. We also had those reel-to-reel tapes, as I recall, like you see in old movies, whenever the plot calls for "a computer". The PDP-8 ran a hoary old program in BASIC that the lab engineer had written years ago. They thought about replacing the PDP-8 with a Hewlett-Packard "Mini-Mainframe" computer, but by then, you could buy a surplus, new-in-the-box PDP-8 for a few hundred dollars. So they bought that and delayed buying the H-P for a few years. The mighty PDP-8, once time-shared, and once worth tens of thousands of dollars, was now just so much scrap to be thrown away or sold for whatever price it could command.

There is a pattern here - stay tuned.

They wanted to test a Hitachi centrifugal chiller in the packaged A/C building in TR-20, which didn't have the same computer setup. They had a PR1ME computer (no, that is not a typo!) because UTC foolishly bought PR1ME. Think IBM mainframe, but an orphan. The PR1ME was a piece of junk and could not log data in real-time from a number of test rooms. So they bought a plethora of Apple III computers with floppy drives and color monitors (!!) to "store" the data and then later upload it in a stream to the PR1ME for "processing." The data would then be "downloaded" to the Apple III to be plotted on an X-Y 4-pen plotter (again, not a novelty back then, or today). It was a cantankerous solution to the problem, and done only to save face for executives who ordered the PR1ME computer for TR-20.

Well, I looked at the setup and realized the Apple III was more than capable of logging the data, processing it, and printing it out. I got a copy of the original PDP-8 BASIC processing program from my friend in the TR-1 heavy machinery lab, and entered it into the Apple III (making backup copies to be sure!), translating when necessary from DEC BASIC to Apple BASIC. And it worked.

Of course, I also had to install the chiller, wire it up and instrument it - and then connect the instruments to an interface board that converted the 4-20mA analog data to digital and then loaded it into the Apple III. But after it was all done - in less than a month - we ran the machine and printed out the data. Some "suits" came by to see the work and I did a test run and showed them how it could print out a graph in near-real-time.
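The 4-20mA part is simpler than it sounds: 4mA means the bottom of the sensor's range, 20mA means the top, and everything in between is a straight line. A minimal sketch in modern Python (the 0-100 PSI range is a made-up example, not the actual chiller instrumentation):

    # Linear scaling of a 4-20mA current-loop signal to engineering units.
    def loop_to_engineering(current_ma, lo=0.0, hi=100.0):
        """Map 4mA -> lo and 20mA -> hi, linearly in between."""
        if not 4.0 <= current_ma <= 20.0:
            raise ValueError("out of loop range - broken wire or bad sensor?")
        return lo + (current_ma - 4.0) / 16.0 * (hi - lo)

    # A hypothetical 0-100 PSI transducer reading 12mA is at mid-scale:
    print(loop_to_engineering(12.0))  # -> 50.0

One nice property of starting the scale at 4mA instead of zero: a reading of 0mA is unambiguous. It means a broken wire, not a zero measurement.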

"That's all very well and fine," one said, "But where is the step where you upload the data to the PR1ME computer for processing?" I didn't realize the guy I was talking to was the one who ordered the expensive and useless PR1ME computer. So I replied, "Oh, we don't need to do that - all the processing is in the Apple!"

Awkward silence.

I am sure they agreed that "the kid" didn't know what he was doing and the data was no good, and probably re-did all the work using the PR1ME computer (and paying a month's salary to a "programmer" to rewrite the processing software in COBOL or whatever, and then translate it all back into BASIC for the Apple to plot). But at that point, I was back in school working on my Engineering degree.

School was a computer desert, oddly enough. One of the professors there was keen on an obscure computer language called "APL" as he had a hand in promoting it. To program in APL, you used Greek letters and other special symbols as commands. So the first thing you did was tear all the keys off your computer and replace them with new keys with these symbols on them. I'll bet that asshole at S.U. had a hand in selling the replacement keys, too! If you had an oddball machine (like the ill-fated PCjr), then you had to put little stickers on the keys. The idea was, each Greek letter stood for a command, so instead of having to write "LET X=1" which took six keystrokes, you could replace "LET" with <alpha> (or whatever, APL freaks can go fuck themselves) and save three keystrokes!

The idea was that everything was expressed as vectors and arrays, so you could process whole arrays of data at once, which might be useful, for example, in graphics rendering (they claim). But I am not so sure. Maybe it makes it easier for the programmer to write code to process a graphics array, but the machine still has to crunch the numbers, one bit at a time. Tellingly, not many people use APL today, and the Wikipedia entry seems like a long-winded apology for it. Some claim it is not even a programming language but rather a calculating tool.
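To be fair to the APL crowd, here is the whole-array idea in miniature, using Python with NumPy as a stand-in (NumPy borrowed the concept; this is an illustration, not actual APL, and the brightness data is made up):

    # Scalar thinking vs. array thinking over some made-up pixel data.
    import numpy as np

    pixels = np.array([10, 200, 35, 90, 255, 0])

    # Scalar thinking: loop over every element yourself.
    doubled_loop = [min(2 * int(p), 255) for p in pixels]

    # Array thinking: one expression over the whole array at once.
    doubled_array = np.minimum(2 * pixels, 255)

    print(doubled_loop)            # [20, 255, 70, 180, 255, 0]
    print(doubled_array.tolist())  # same answer, no explicit loop

And my point stands: the notation is terser, but the silicon underneath still grinds through the elements one (or a few) at a time.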

What does APL stand for? "A Programming Language" - the language itself is just as clever.

As I noted before, back then, "Computer Science" wasn't considered "Science" but "Art" and if you got a Computer Science degree, it was a Bachelor of Arts degree from the Liberal Arts department, not from the Engineering school. "Computer Engineering" didn't exist. There was Electrical Engineering - where you might learn how to design a microprocessor, but not how to program it. Even the Patent Office didn't recognize "Computer Science" as "Science" and would not hire CS graduates as Examiners or, indeed, allow Patents directed to computer software (which, they argued, was just human thoughts expressed electronically).

This reflected a longstanding prejudice against programmers (which is probably justified - IT people and gamers are gross, let's face it!). The money was in the hardware - the mainframe and the circuitry. In the case of IBM, the money was in the punch cards - they sold (leased) the machines at cost and made their money selling you punch cards, which - as with a McDonald's franchisee - you had to buy from them and only them. Woe be to the IBM user who was caught using 3rd-party punch cards. They would rip the leased machine out of your building overnight.

Programming was seen as merely "using" the machine. Think of it this way: You sell a car to someone, and you make money selling them a car. You don't pay them to drive it. And that is how computers were viewed back then. You hired a "programmer" who was paid very poorly (compared to today) and the major cost was the lease payments on the mainframe, or maybe your punch cards. Programmers didn't have the stature or the leverage they have today.

But at about that time, of course, the IBM-PC was introduced. And while it took a few years to get going, it turned out that the key to the personal computer wasn't in the hardware, but in the operating system and software. When I bought that PCjr, I didn't buy programs with it (like a primitive checkbook program for $99!) because I always wrote my own programs - that is what one did back then. Off-the-shelf software? Pray tell, why?

But just as the folks at Syracuse University were blinded by APL and the dichotomy between "real" Engineering and computer programming, many of us at the time were blinded by the new reality of the Personal Computer. There would be thousands - and eventually millions - of computers out there, and all of them would need software; 99.9999% of the users would be unable to write their own code (and unwilling even if they could).

As with the orphan CP/M machine at GM, people bought computers without really knowing why they wanted or needed one. They needed programs to run, and that's where the real money was to be made. And like so many other people, I was blind to the opportunities in computer programming. Programmers made little money back then, because they were captive employees of one company, and wrote custom software for that company. My Dad's little truck-clutch factory had its own IBM 360 mainframe, along with a number of "operators" and at least one programmer. They were slated to replace it, and by then you could buy PCs and network them together. But the IBM guy said some silly little PC couldn't do all the heavy lifting of a mainframe computer! The opposite was actually true.

But the point is, a programmer was on salary to one company and wrote software for that one company. In the PC world, you would write a program and sell millions of copies of it - hundreds of millions, in fact. It is the difference between being a local lounge singer, playing the piano and singing for tip money at the local bar, and being a stadium-filling, Platinum-record-selling superstar who is worth millions. Just as mass media made superstars of a few people, the PC ended up creating tech superstars.

And most of us were blind to this at the time.

I digress, but many think this is the start of the Bill Gates story - that he saw this all coming and "designed" DOS and Windows and became a billionaire. The reality, as I noted before, was that a lucrative IBM contract to supply IBM-DOS landed in his lap. He paid someone else to write the first version of DOS, and pocketed the difference. The contract allowed him to sell copies of DOS to third parties. IBM did this because who would want a copy of DOS? After all, IBM gave it away for free with each computer. But the "clone" market took off and eventually forced IBM out of the computer business, and all those clones needed copies of DOS - which Gates pretty much had the monopoly on.

I graduated from Syracuse University having learned little or nothing about computers or computer programming. What I knew about programming I learned on the job - as so many do. You spend 40+ hours a week doing something, you learn it. You spend an hour or two a week in class, you get a rough idea.

I went to work for the Patent Office and we had no computers there, other than shared terminals for UNIX machines that could access the Patent Office database, using obscure Boolean search terms, displayed on a monochrome (orange) screen or printed out on a nine-pin dot matrix printer. Many of the Examiners would bring computers from home, as writing out office actions on paper with pencil was tedious. I still had that old PCjr, believe it or not, and I guess it worked better than I remember, as I used it for a year or so at the PTO before buying a new Hyundai 286 machine, with a 15" VGA monitor and a 20 MB hard drive (!!) for about $2500.

By then, I not only had a nine-pin dot matrix machine hooked up to the PCjr, but also a Smith-Corona daisywheel "letter quality" printer. It was basically an old Smith-Corona typewriter (from Ithaca, New York) without a keyboard. When you printed with it, it sounded like a machine gun going off. One of the other Examiners "found" an acoustical enclosure for it, and it sat on its own printer stand. When the new Hyundai arrived, I bought a 24-pin "near letter quality" Panasonic KXP-1132 printer and ditched the daisy wheel and the nine-pin. I was in the big time now - running WordPerfect and all. I could even "dial up" to the USPTO database using my US Robotics 2400 Baud external modem. Whoo-whee! That was lightning fast!

I mentioned the Patent Office database, and IBM comes into play once again. Back in the 1960s, patents were typeset to be printed, and they used IBM tapes to record the typesetting instructions. Someone saved all those tapes, from the 1960s onward, and by the 1980s someone had the bright idea that maybe you could put that data on a large hard drive or drives, and then index it and make it into a searchable database. It was not so much designed as a database; we just happened to fall ass-backwards into it. The computer world is full of such happy accidents.
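"Index it" sounds fancy, but at its heart a searchable patent database is just an inverted index: a table mapping each word to the set of documents containing it, with Boolean searches done as set operations. A toy sketch in Python (the patent numbers and text are invented, and this is not a claim about how the PTO actually built theirs):

    # A toy inverted index: map each word to the set of patents containing it.
    from collections import defaultdict

    patents = {  # hypothetical patent numbers and snippets of text
        "4,000,001": "refrigeration compressor with centrifugal impeller",
        "4,000,002": "centrifugal pump seal assembly",
        "4,000,003": "compressor motor winding insulation",
    }

    index = defaultdict(set)
    for number, text in patents.items():
        for word in text.split():
            index[word].add(number)

    # A Boolean search, like on those old orange terminals:
    # compressor AND centrifugal
    print(sorted(index["compressor"] & index["centrifugal"]))  # ['4,000,001']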

Although PCs were well known by then, and networking was even a thing, the USPTO was dogged by an antiquated Wang computer system. Seems the office wanted to go to PCs and a network, and they put it out for bid. Wang, by then a struggling mini-mainframe company, protested the bidding process, saying it was "unfair" that a certain operating system (DOS) should be used. They won the protest, and the Patent Office was saddled - for a decade - with some surplus Wang machines that kept that company solvent for yet another quarter.

So when I typed my office actions on my computer, I could print them out and then hand them to the typist, who would then re-type them into the Wang machine. It was a laborious and cantankerous way of doing business. Eventually, with the rise of "near letter quality" printers, we were allowed to use our own printouts attached to the Office Actions, rather than have them re-typed.

After two years, I left the Patent Office and went to work for a law firm. Once again, computers were not provided, and many of us had to bring our own computers from home. This was 1990, by the way, not some dark ages of the 1980s. Everyone - and I mean everyone - ran WordPerfect back then, and it had a near monopoly in the word processing field. It was easy to exchange documents via 5-1/4" floppy disks in the pre-networking world.

Windows became a thing about that time, and I ran out to buy a copy - and a mouse and a mouse "card" and a memory "card" along with 256K RAM "chips" to populate it (via soldering iron). I upgraded my old 286 machine and got...nothing. Windows ran slowly and unless I wanted to spend several hundred dollars on a "suite" of Microsoft Windows programs, it was little more than a glorified shell program. I sent it back to Microsoft and asked for my money back - which they cheerfully gave.

The problem with being an early adopter is that if no one else is on your platform, you are kind of out there on your own. Imagine being the only one on Facebook - kind of pointless, no? Since the firm I worked at was on WordPerfect and DOS, running Windows was kind of pointless.

Of course the next year, the firm bought all new PCs for everyone, and they all ran Windows and were networked together. 486 machines! Woo-hoo! Maybe even 17" monitors, too! And we had access to the network's HP-4P "laser printers" which churned out documents and smelled like diesel engines.

Our firm made the mistake of going to WordPerfect for Windows, which required that everyone re-learn how to use WordPerfect. WordPerfect wasn't a WYSIWYG system - you had to embed "codes" in the text (like HTML) to format the document. Each one of the "function keys" performed a different formatting or other function, and after a while, you got used to hitting CTRL-F7 to do something (I forget what - print?) and you could really make time if you were a keyboardist.

WordPerfect for Windows changed all that. Actually WordPerfect 6.0 for DOS did as well - changing the function of the function keys so that "save document" was now "delete all documents and format hard drive" or something to that effect. Did Bill Gates plant a mole at WordPerfect to destroy it? If not, it certainly worked out that way.

So, like the rest of the world, we all switched overnight to Microsoft Word, and WordPerfect went from flying superstar to out-of-business - a pattern we would see again and again in the brave new computer world.

Again and again.

Computers became less of a novelty and more of a tool. About that time, I was "building" my own machines, which is to say, assembling them from parts. That was the fun of owning a PC, as compared to a Mac. You could go through catalogs and pick out parts and find the best prices - or swap parts with friends or find them for sale locally (on a dialup BBS!) - and "build" your own PC. The day you put it all together and turned it on and loaded DOS - and it worked! - was a magic day. And it never ended there, either, as you could constantly upgrade your machine as time progressed.

I ended up building several. Back then, a new computer was still a couple of thousand dollars, so building your own could cut that cost in half, particularly if you could "find" some parts lying around. Parts were expensive back then. But prices came down. By the time I had my own practice, I was chagrined to see that Dell was selling complete systems for about $500 - so I ordered four and networked them together. I recall buying parts to repair one of my old computers and being shocked that a keyboard could be had for as little as $10 and a complete case and power supply for only a little more. On a $500 computer, the most expensive component was Microsoft Windows.

The world had changed and no one told me.  Software was now the main game.

About that time, I was flying out to Silicon Valley to write Patents for tech companies. My computer education, such as it was, was quickly becoming obsolete. A lot of the new tech made little sense to me. Why create a new "Universal Serial Bus" when parallel ports are so much faster? And what's all this hullabaloo about "wireless" communication? And why did we name it after some Viking dude with bad teeth?

Back then, the PC was still not king. Sun Microsystems was making and selling something called a "SPARC Workstation" and all the Silicon Valley techies were using them, along with something called "MATLAB" to simulate circuit designs. Yes, circuits. Hardware was still king in Silicon Valley, where actual silicon was made, or at least designed. But a sea change was coming. The messy business of designing and making hardware - which was quickly becoming a commodity item - was fading or being off-shored or sent to places like Boulder or Austin. Software was now king.

And the Internet - which had been around for decades - started to become a "thing." About that time, we all signed up for an "ISP" and dialed up to access our "e-mails" and primitive discussion groups. Websites started to become popular, and we all loaded Netscape Navigator to view them. I was not amused - I preferred to run my computer in "terminal mode" and type my commands at the command prompt - and read data in ASCII format on the screen. It was fast and simple and there were no slow-loading graphics to distract you. And back then, websites were pretty useless. You couldn't order things online or check your bank balance - the system wasn't that secure!

Of course, all that would change over time, and I was dragged, reluctantly, into the Windows world and into the world of the World Wide Web.

For most people at the time, computers and the Internet were a literal waste of time. People would buy a PC or get one for Christmas and then have no idea what to do with it. You could play a few simple games or balance your checkbook or use a primitive word processor. Sears, together with IBM, attempted to start an online service called "Prodigy" and sold computers as well. Sadly, they never synergized this with their famous catalog to create the first online shopping portal. The bandwidth was just too narrow and the graphics too basic. And catalog shopping was on its way out - malls were in!

We were all so blind - even major corporations!

America Online, or AOL, filled in that space - giving ordinary people "something to do" on the Internet. It had primitive discussion groups and even retail outlets. You would type in an "AOL Keyword" to find a merchant's site, and for a while, every billboard and TV ad would obnoxiously mention the advertiser's AOL Keyword - much as many today say, "find us on Facebook!"

AOL was the wading pool of the Internet, and initially, AOL made it hard for users to access the real Internet. But eventually they had to relent, particularly as the World Wide Web thing took off. AOL users found themselves in the deep end of the pool - filled with sharks - and often made fools of themselves. "AOLamers" we called them, and having an "@aol.com" e-mail address marked you as a novice. Overnight - once again - a high-flying company came crashing down. AOL once bought Time-Warner. Time-Warner had to pay someone to take AOL off their hands.

Gee, never seen that happen before, eh?

You see a pattern here - companies make it big in the computer world and are forgotten just as quickly. My sister-in-law's family was involved in the creation of Lotus 1-2-3 spreadsheet software. Yea, it is gone as well, replaced with Microsoft Excel. They made a lot of money during their brief time in the sun - but the sun sets quickly in Silicon Valley.

Most of the companies I have discussed so far have faded from view. HP, Wang, DEC, Lotus, WordPerfect, AOL, and even IBM - which does some sort of software thing now and is the same company in name only. There would be many more to skyrocket and then fizzle and fall - and many more to come in the future. Exactly how many social network apps can the world support - particularly since many are losing money? It is an interesting question that will be answered in due course.

Of course, the story doesn't end there. In fact, the modern era had just started. After the Windows 95 tech crash, a lot of hardware companies, hoping to sell new machines to run Windows 95, went bankrupt. Seems that people stopped being so anxious to "upgrade" every time a new version of their favorite program came out. The old version of WordPerfect worked fine - why change horses? If you can wait even a few years, you can switch later on, when the new program has all the bugs worked out.

So we had a world where almost everyone had a computer - or at least people with means, which is a nice advertising demographic. What could they do online, other than work? Well, games started to take off, and online shopping sites like eBay skyrocketed, taking PayPal along with them. And you could buy books on Amazon or download them to your Kindle!

Discussion groups were still a thing, of course, but they left the "alt" newsgroups of the primitive, text-based Usenet and were increasingly hosted on websites. The "alt" groups were largely un-moderated and they became inundated with SPAM and toxic content to the point where they were unreadable. When only 1 out of 100 messages is from a real person, well, it is just noise drowning out signal.

I used to go to discussion groups, for example, to exchange repair tips for BMWs. Over time, some groups gained in popularity, while others fell by the wayside. Most of these sites lost money or made little for their owners, so they sold them off and the new owners "changed direction" and erased all the old postings and photos.

But I noticed something else. Many of these sites were more than just factual discussions about repairs or cars or whatever - they became social sites. Sometimes people wanted to meet up in person or attend events. Other times, they just wanted to chat with each other. So you are discussing repairing a control arm bushing and "Ted" chimes in with, "so, Ed, how's the wife and kids?"

People wanted to socialize online. And this was the "genius" of Mark Zuckerberg, such as it was. Sites like Second Life and MySpace preceded him and had some limited success. But they kept screwing the pooch: they never made money, and when they tried to monetize in the worst possible ways, the users fled. Facebook was in the right place at the right time, when there was a critical mass of installed computers and a lot of bored people looking for something to do, other than play computer solitaire.

And it took off - although, like its predecessors, it struggled to make money. Early ads on Facebook were mostly for weight-loss scams or outright identity theft. "Obama wants Moms to go back to school!" they would scream, while showing a disturbing photo of some homeless man. This was the future of the Internet? Hard pass!

Zuckerberg did see that "mobile" was the future for Facebook, and by this time, having survived so many computer world booms and busts, I was fatigued by it all. Now we have to buy smart phones? Ugh! This wasn't something I looked forward to. My love affair with computers died with Windows - when I could no longer "see" what was on my hard drive but rather had to trust Microsoft to do what's best for me. My love for computers died when you could no longer just shut them off by hitting the power button.

And smart phones are more of the same. You're never really certain that some sort of virus or bot or whatever isn't running on your computer or smart phone anymore. There are too many layers of bullshit between you and the 0's and 1's, and this is mostly because the vast majority of users would break their machines if allowed access at the root level.

So I resisted buying a smart phone for as long as possible. I saw what it did to people - and what it would eventually do to me. People had to have these damn things with them at all times, stare at them, scroll, hoping for a hit of dopamine - that someone would "like" their Facebook posting of a picture of their cat. People would scroll Facebook on their phone while you were talking to them. It was rude and yea, we've all done it - which makes it so much worse. Somehow Zuckerberg tapped into the darkest part of humanity - and won it over.

Computers turned from a useful tool to an annoying distraction to a highly-addictive electronic drug. And as people spent more and more time on "social media" they started doing more and more odious things. Political opinions started to move to the extremes. Ideas that had been dumped in the trash-heap of history were dug out, dusted off, and given new names. Communism, Anarchy, Nazism, Fascism, White Supremacy - all proven again and again to be really, really bad ideas - have been re-branded as "Democratic Socialism" or "Antifa" or "Neo-Nazi" or "Alt-Right" or "Replacement Theory" and no one seems to notice.

It took a long time to get where we are today, but it seems the pace of change is accelerating. Streaming services are quickly overtaking cable television, which is struggling to adapt to this new reality. But like the alt-discussion groups in the days of ASCII, cable TV has turned into little more than SPAM these days: 50% of airtime on many channels is advertising, a ratio which makes it unwatchable for all but the most dense of people - who are often in no position to buy the advertisers' products.

Where we go from here and what the future has in store, I do not know. Based on the patterns I've seen, the one sure constant is change. Companies that seem to be part of the landscape today may be gone tomorrow as the short-attention-span of humanity moves on to the next brightly-colored object of their fancy.

I guess what got me started on this was the fact that we are out of Internet and phone service range for a few days. Yes, imagine the horror. Phones are shut off and stone cold. No updates on the news, no text messages from friends or even the bank. It can all wait a few days and that's OK. There was a world before the Internet - before the personal computer - and before the smart phone or even the analog cell phone.

Surprisingly, it didn't take too long to adapt. You quickly - almost immediately - stop looking at your phone. It doesn't work? Shut it off. We've been reading books - old paperbacks for me, Mark perusing the 1,000 titles on a mildly malfunctioning tablet device. We've been kayaking, riding our bicycles, or just sitting on the deck overhanging the water, playing gin rummy and drinking inexpensive Cava. Life could be a whole lot worse.

Of course, some of our neighbors are still "plugged in" - rarely leaving their campers because of the heat and instead staying inside to watch satellite TeeVee and getting angry over something Tucker Carlson told them to be angry about - in air-conditioned comfort. Maybe it wouldn't be such a bad thing if the Russkies blew up a couple of our satellites. But then again, the Russians love Fox News and vice-versa.

Since we are "offline" I am typing this on my laptop in Word 2000 - a 22-year-old program, nearly as old as my bicycle. I can store it and then cut-and-paste it to Blogger when we get back to what passes for "civilization" these days.

It is funny, but as a kid of 13, I was enthralled by computers, and my elders weren't so sure about them. My Mother would go on drunken rampages and say, "What are all these computers going to be good for, anyway?" And I would try to explain how a microprocessor-controlled engine computer could easily double the gas mileage of that era's cars, while reducing emissions. But that fell on deaf ears.

Most people don't realize how computers allow us to survive this way - by bringing order to chaos, efficiency to waste. But when it comes to the human machine, it seems computers do the opposite - encourage primitive thinking and the cargo-cult mentality. The dumbest ideas of humanity - flat earth, faked moon landing, anti-vaxx, conspiracy theories - have more followers today than ever before, and I think that is on a per capita basis as well.

Being "unplugged" from the Internet wasn't such a bad thing. Of course, it was only for three days. And I am sure once we are in cell tower range, our phones will go berserk with text messages and e-mails asking where the hell we've been. (UPDATE, yea, that).

I guess there are two takeaways from this. First, if I had trusted my instincts early on, I could have been writing software instead of learning the dead-end trade of designing circuits. The former became a path to riches, the latter merely a commodity. Maybe both are, today.

Second, nothing on the Internet or in the computer world is permanent, no matter how permanent it looks or how hard they try to make it look permanent.

Nothing is constant except change.