Think about what you’re claiming for a second. The geek market could not have supported a single model of a single machine for very long (just look at how the Altair fared after the hype died down). But there were dozens of highly successful 8-bit micros intended for home use in the US alone (and dozens more in the UK market, and dozens more across the Soviet Union).
The Commodore 64 is still literally the best-selling single model of computer of all time, simply because regular people who didn’t identify as geeks were still willing to buy an original-model Commodore 64 in the early 90s. Both the Lisa and the original Macintosh flopped — Steve Jobs was pushed out of Apple before it shipped a profitable GUI machine — so literally all of the money Jobs dumped into NeXT and Pixar came from the persistently-profitable Apple ][ line, a line that was marketed as an educational tool for children and a utility for small businesses. Everybody who had one of these machines was doing simple programming, and enough of them sold to make millionaires out of Jobs, Woz, Tramiel, Sinclair, Gates, and dozens of others.
Sure, GUI machines eventually became popular, but the percentage of households with a computer didn’t spike with the release of the Lisa; it spiked with the first commercial dialup services that provided access to the actual internet (as opposed to a BBS), more than a decade later. The Macintosh, Amiga, and Atari ST (the premier GUI machines of the 80s) consistently had their lunch eaten by 16-bit PCs and by the continuing Apple ][ and Commodore 64 lines until the early 90s. And while Windows nominally debuted in 1985, it wasn’t in common use among PC users until the early 90s either, because until Windows 3.x it wasn’t much different from the DOSSHELL file manager that shipped with PC-DOS. Even then, Windows didn’t really become common on home computers until the release of Windows 95, because earlier versions couldn’t run DOOM (and the marketing around Windows 95 focused on being able to run DOOM as much as it focused on having a TCP stack and a web browser).
I know that “GUI machines brought computing to the masses” is a common cliché — it became a very effective bit of Apple marketing about ten years after Apple itself had abandoned that strategy — but it’s never really been true. On a fundamental level, GUIs aren’t more intuitive to totally naive users than command-line interfaces are — and the command line has the advantage that anybody who can read can learn to use it by typing in examples from a book, while even mouse use must be taught by someone physically present.
The plain fact is that people who would never call themselves geeks or programmers regularly bought computers that required programming, learned a tiny bit of BASIC, and used that tiny bit of BASIC to type in games from magazines and debug them, to do their taxes, or just to screw around — and they complained about it no more than people complain about Facebook’s UI.
Please do a tiny bit of research before repeating obviously-flawed clichés. You call yourself a skeptic, so act like one.