TinyCore

2022-05-15

I’ve had a bit of a fascination with CollapseOS, which is what chiefly introduced me to "collapse", permacomputing, and similar ideas. Its final incarnation is a Forth that can run on most z80 and 6809 processors. That’s an intriguing thing to accomplish, but after seriously trying to work with it and considering what a collapsed world would actually look like, I’m firmly convinced it’s the wrong approach.

Collapse

I won’t litigate the whole thing, but collapseniks of this stripe believe that the modern world is both unsustainable and soon to end - not from a zombie apocalypse or a meteor, but from a supply-chain or economic collapse so severe that it halts advanced manufacturing long enough, and thoroughly enough, that it can never be restarted (since making modern semiconductors relies heavily on the previous generation of semiconductors). In that case, the only computers and electronics we would ever have are the ones we have today. There would never be another generation of phone or processor at any level.

So, for those who want to keep the advantages computers provide, what’s the best way forward?

Forth

CollapseOS’s answer is a Forth: a very small, expressive language that is REPL, compiler, and interpreter all at once. It has a minuscule footprint and blazing speed, and while its syntax is foreign (it’s a reverse-Polish-notation stack language), it’s dead reliable - famously so, having flown to space on multiple craft precisely because of that reliability.
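To give a flavor of that foreign syntax, here’s a minimal sketch at a standard Forth REPL (CollapseOS’s dialect differs in the details, but the stack-and-RPN style is the same):

    \ : compiles a new word until the closing ;
    : SQUARE  DUP * ;   \ duplicate the top of the stack, then multiply
    5 SQUARE .          \ run it immediately: push 5, square it, print 25

The same prompt that compiles SQUARE also executes it on the spot, which is what lets one tiny system act as REPL, compiler, and interpreter at once.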

The strategy goes that z80 and 6809 processors are everywhere in the modern world - anything with an embedded processor made since 1980 probably uses one. They’re cheap as dirt and therefore ubiquitous. But they’re not beefy enough to boot a Linux (let alone macOS or Windows), so any OS on them is going to look a lot more "integrated" and a lot less "gui". You’d be fortunate to even have a terminal prompt in these situations. So, scrappers would take bits and pieces from various devices with these processors and build junkyard computers just capable enough to do what you need. If the processor dies, grab another from a scrapyard somewhere and slot it in. Computing would be less of a "find a packaged computer" affair and more of a "keep a bin of parts" one.

But, why?

Let’s take a step back and examine the scenario for a second. All advanced manufacturing has ended. What kind of computers are you likely to use? z80s that require extensive soldering, knowledge of their pinouts, memory chips you must find, manipulate, and somehow have the datasheets for, and peripherals (like screens or keyboards) that have to be reverse-engineered and soldered on as well? It seems unlikely.

I would expect that people use the billions (if not trillions) of x86/ARM machines out there first. A PC recycling shop has enough parts to put together a hundred such computers, to say nothing of datacenters filled to the brim with hardware that is often thrown into a garbage can well before it fails, simply because better stuff came out. Why on earth would anyone solder anything when we have so much computing power (and storage!) lying around? There are enough machines to last a century, if not more. And their usefulness will be much higher: they can interface with any modern device, they can network, and they can still do GPIO if needed.

TinyCore

The immediate concern would be that modern software is just too heavy (and not modifiable enough) to run on most of the older machines we’d find. What good does it do us to have a 20-year-old Pentium if it can’t run anything we want to run? Well, you’ve got that problem (and worse) with z80s anyway. At least this way, we can keep working mostly undisturbed.

But for truly ancient machines that at this point can barely run Debian, there’s an interesting answer - TinyCore. It’s an older project, and isn’t maintained to the level you’d probably want, but the gist is that it loads entirely into memory. All programs, kernel modules, and the home directory live in RAM. Persistent storage is only used if explicitly mounted, and you would almost never mount your boot drive. This means that every boot starts with the absolute minimum you need - everything else is loaded or unloaded into memory by you, as needed. You can’t screw up an install or get into dependency hell where booting no longer works, because every boot begins from a known-good configuration.
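In practice the day-to-day workflow looks something like this - a hypothetical session, using TinyCore’s stock tce-load and filetool.sh tools, with the package picked purely for illustration:

    tce-load -wi nano.tcz   # fetch the extension and load it into RAM
    nano notes.txt          # work in the RAM-backed home directory
    filetool.sh -b          # explicitly back up, or it vanishes on reboot

Nothing here touches the boot medium unless you ask it to, which is exactly why a broken install can’t wedge the machine.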

And this means it’s damn small - barely 21MB in memory. Even with 64MB of RAM and a Pentium III from the late ’90s, you’re still well in the clear. While I haven’t got a machine that old, I do have a couple of spare Optiplexes from 2008 that just barely manage with Lubuntu but take to TinyCore very well.

Now, TinyCore has its own package format, nothing in it is up to date, and it’s old and basically unmaintained - I wouldn’t seriously hold it up as a messiah. But compared to CollapseOS, it’s a godsend. It boots, it runs useful software, and you can run and compile whatever you like on it (for fun I compiled a couple of Go projects using a reasonably modern Go version, and everything worked exactly as you’d expect). It’s Linux, stripped down to its barest, purest elements. If you’re in a desperate situation, this seems like the answer you’d actually want to reach for.

All site content protected by CC-BY-4.0 license