Building More Explainable Computing?

Alasdair Allan

The way people experience computing is changing. Anything beginners do with a modern computer is almost inevitably abstracted away from how it works, and the jump from getting started to building something that looks and feels like it could have been built by a professional is now so much larger.

Over the last decade or more, this trend has become increasingly evident, and we’ve seen a corresponding movement to try to counter it: to explain computing, not just use it. Which is where projects like the DDL4-CPU, a modular 4-bit CPU design, come in.

The DDL-4 is a modular CPU, with its individual logical units broken out onto separate boards. Plugging the colourful boards together using edge connectors lets you configure and build a functioning CPU from its logical parts.

Reminiscent of the Mega-One-8-One, a previous build from the same maker that recreated the 74181 bit-slice arithmetic logic unit, the modular DDL-4 is an attempt to make computing more explicable to a modern generation of developers who are increasingly divorced from the underlying hardware they’re working on.

Both of these projects follow a trend towards “dis-integrated circuits,” perhaps the best known example being the MOnSter 6502, a working transistor-scale replica of the classic MOS 6502 microprocessor from Evil Mad Science.

However, there are other indicators of the same trend. Mirroring the nostalgia for tactile things that could be seen in a previous era with the move from crystal sets to transistors, when radio became that bit more opaque, are the super-sized, and sometimes fully functional, boards that have started to appear at Maker Faires over the last couple of years.

Starting with the now somewhat venerable 555 footstool, the over-sized boards have become a fixture at Maker Faire. A lot of the time they’re built using the advertising budget of bigger (and smaller) companies, but I’d argue that they serve the same purpose, or at least cater to the same desire: to make impossibly small and inexplicable things larger, and more tactile.

While there has been a big push over the last few years to make coding more accessible, on a modern desktop or laptop you are insulated from the underlying hardware. Simple techniques like bit banging, or binary bit flags, where individual bits inside a byte are used as flags, are now seen by many developers as exotic. The everyday tools that most people use to write code today assume so much (windows, layout, graphics) that you start out building on things that other people have built, on top of other things, in a stack that stretches back into the era of assembly language.
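To give a sense of what that looks like, here’s a minimal sketch, in C, of the bit-flag idea: a handful of made-up status flags packed into a single byte, set, cleared, and tested with bitwise operators. The flag names are invented for the example.

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical status flags, one bit each, packed into a single byte. */
#define FLAG_READY (1u << 0)  /* bit 0 */
#define FLAG_ERROR (1u << 1)  /* bit 1 */
#define FLAG_BUSY  (1u << 2)  /* bit 2 */

int main(void)
{
    uint8_t status = 0;

    status |= FLAG_READY;           /* set a flag */
    status |= FLAG_BUSY;
    status &= (uint8_t)~FLAG_BUSY;  /* clear a flag */

    if (status & FLAG_READY)        /* test a flag */
        printf("ready\n");
    if (!(status & FLAG_ERROR))
        printf("no error\n");

    return 0;
}
```

Packing flags this way saves memory and maps directly onto the sort of hardware registers the article is talking about, which is exactly the kind of closeness to the machine that most modern tooling hides.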

This isn’t a bad thing, but it does leave those of us who started out closer to the hardware worried. Recently, I’ve been giving a series of talks about the history of the Internet, and how the way we talk to the underlying hardware still matters in a day and age of black boxes.

In an era where the rise of rapid prototyping has led to plug-together hardware along with plug-together code, it’s sometimes important to take a step back and remember how we got here. Because how we got here can sometimes tell us where we’re going, and making computing more tactile can make it more explicable, as well as more magical.
