The chip that changed my world – and yours

Opinion It lasted 50 years, but history finally claimed it. Zilog has called time on the Z80 CPU. Readers may have owned one in an 8-bit microcomputer or showered coins on one in an early arcade video game.

It was ubiquitous in the early to mid-’80s and popular well into the ’90s. All this is well known to vintage tech heads, but it deserves more than nostalgic memories. It marked the beginning of the great trends that defined the industry into the 21st century, and in one key attribute changed the lives of uncountable hordes of users, including yours truly.

The Z80 succeeded because it was the first mainstream microprocessor to care about compatibility. There were many companies making CPUs when the Z80 arrived, and none were compatible with their competition. A new platform needed new software, and if it didn’t get it, it died.

The Z80, however, was compatible with a rival, the Intel 8080. Both supported the early CP/M operating system, which ran the majority of the business applications of the time, at least outside the Apple ecosystem, and unlike Apple, CP/M's creator Digital Research would license its products to anyone. The Z80 wasn't just 8080 code-compatible, it was easier to design a computer around: it eliminated the need for extra circuitry to refresh dynamic RAM chips, it needed only a single power supply voltage where the 8080 needed three, and it had additional instructions to simplify common software tasks. Most chip companies at the time assumed their products were good enough to establish their own niche; the Z80 was designed to be attractive to the existing market and grow from there.

The main gamechanger, though, was that it became very cheap. That wasn't the case early on, when it cost around $60 against the 6502's $25, but by the 1980s a host of other companies, mostly in Japan, were making their own Z80s for their own business computers. That, plus four cycles of Moore's Law, cut the price to the point where a complete new Z80-based computer could be bought in the UK for £50 – not far off the price of the chip alone four years earlier.

This was the point that the Z80 entered my life, as part of the self-assembly Sinclair ZX81 kit. Home computing had been too expensive before then for many, especially with competing demands from high-ticket household items like color TVs and VCRs. Dropping hundreds if not thousands on a personal computer was out of the question for millions. The Z80 changed all that – its suitability for a minimalist design, coupled with its own low cost, opened up a market at the very bottom that no other CPU could serve. There was no 6502-powered equivalent of the ZX81.

The chip democratized IT for a generation, including mine, and fired our imaginations. Life at the bottom wasn't pretty – a 1K tape-based monochrome machine with no lower case, sound or graphics doesn't look like much. But compared to no computer at all, it was magnificent. That simplicity made it far easier to learn the machine code at the heart of the computer, bypassing the built-in BASIC interpreter. That meant learning about memory maps, stacks, clock timing and clean code – skills still invaluable in cybersecurity today. Many thousands taught themselves those in their bedrooms, and many of those became the bedrock of IT thereafter.
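The mental model those bedroom coders built – a program counter walking through a memory map, a stack growing down from the top of RAM – can be sketched in a few lines. This is a toy fetch-decode-execute loop for illustration only; the opcodes are invented, not real Z80 encodings.

```python
# A toy machine in the spirit of what bedroom Z80 coders learned:
# a memory map, a program counter, and a stack. Opcodes are made up
# for illustration - they are NOT real Z80 instruction encodings.
MEM_SIZE = 1024  # 1K, like the unexpanded ZX81

def run(program):
    mem = bytearray(MEM_SIZE)
    mem[:len(program)] = program
    a = 0              # accumulator
    pc = 0             # program counter
    sp = MEM_SIZE      # stack pointer: stack grows down from top of RAM
    while True:
        op = mem[pc]; pc += 1
        if op == 0x01:    # LOAD imm: accumulator = next byte
            a = mem[pc]; pc += 1
        elif op == 0x02:  # ADD imm: 8-bit add with wraparound
            a = (a + mem[pc]) & 0xFF; pc += 1
        elif op == 0x03:  # PUSH: push accumulator onto the stack
            sp -= 1; mem[sp] = a
        elif op == 0x04:  # POP: pop top of stack into accumulator
            a = mem[sp]; sp += 1
        elif op == 0xFF:  # HALT: stop and return the accumulator
            return a
        else:
            raise ValueError(f"unknown opcode {op:#04x}")

# LOAD 40, PUSH, LOAD 0, POP, ADD 2, HALT
print(run(bytes([0x01, 40, 0x03, 0x01, 0, 0x04, 0x02, 2, 0xFF])))  # 42
```

Trivial as it looks, stepping through a loop like this by hand is exactly how concepts such as stack discipline and memory layout became second nature to a generation of self-taught programmers.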

Zilog never repeated the success of the Z80. Once IBM picked the combination of Intel's 16-bit 8086 family and MS-DOS, while allowing Microsoft to license the operating system freely, the same compatibility play that fired up the Z80 took over at its expense. Its successor, the Z8000, found virtually no buyers, and the Z80 lived on as an embedded controller, quietly ticking away unseen in miscellaneous electronics. Now, it is gone.

Only it isn’t entirely. As well as a ghostly presence in hundreds of emulators and an untold acreage of attics where teenage possessions are stored, you can build your own in an FPGA logic chip. The original had only around 8,500 transistors, meaning nearly half a billion of them would fit on Cerebras' four-trillion-transistor behemoth of a device. It's as simple as a bicycle in the age of self-driving cars. It will be studied, recreated and used far into the future.

What can never be recreated is the time when it brought the knowledge of computing to those desperate to learn. Even conscious efforts to bring basic educational computing to the masses such as the Raspberry Pi are, to some, as complex and unfathomable as any modern desktop PC. That’s fine – nobody wants to go back to the days of building your own TV from an article in Practical Television magazine.

Still, something important has been lost, a chapter closed in the history of our defining technology. Goodbye, little chip. You will not be forgotten. ®

Rupert Goodwins