One of the notable side-effects of the inexorability of technology's progress is that while in the abstract we know computers used to be more primitive, it's only when you actually sit down and examine vintage tech that you see that progress in real and graspable terms.

The computer above is, of course, one of the LC range, the adorable little "pizza box" style Macs that were Apple's early attempts at making a cheap (or "Low Cost") computer. I love the LCs, and dearly wish I had a working one in good condition, complete with the 12-inch RGB display that sat perfectly on top.

This pairing, incidentally, was the first time I'd seen a Mac in a shop. And since I was used to the crude green-on-black screens of my family's Amstrad PCWs, seeing a nearly perfect photograph reproduced on the Mac's screen--"nearly perfect" only because I now realize that it wouldn't have been displaying millions of colors--had such a huge impact on me that I can still see the image and remember my wonder two decades or more later.

The thing modern eyes notice about the LC, though, when you flip its lid off, is the chunkiness of all the components. It's not just the big things--such as the hard disk--that are big, either. No, the chips themselves are hefty, thick slabs jutting up from the circuit board, and for all their dizzying complexity inside, I can't help but think they look simple and primitive, partly because each has so few prominent pins, and partly because there are so few chips in total.

This, of course, is an artifact of how surrounded we are today by unimaginably more complex systems, rather than an absolute truth, but that doesn't make it any less real for me. By way of comparison, look at the circuit board of my 2008 MacBook Pro, the machine that (having reassembled it after taking the photo below!) I'm writing on now. Look at how much tinier the chips are, how many more of them there seem to be, and how closely they cleave to the surface.

The irony here, of course, is that my trusty MacBook Pro is itself technically classified by Apple as vintage these days, and that if you want to see the current state of the art in miniaturization, you need to look at, say, the insanely tiny circuit board of the new MacBook or at what goes on inside an iPhone.

Here's maybe a more useful real-world comparison. As you probably know, the Raspberry Pi is a small self-contained computer which, although it has a far greater emphasis on learning and hobbyists than the LC had, was created within the same set of compromises: a focus on producing a cheap, accessible computer. Here it is next to the LC.

In other words, this is a fairer, like-for-like comparison--even if there are still plenty of caveats you could apply--than putting a low-cost desktop computer from 25 years ago next to a high-end smartphone from today.

That miniaturization of computers, of course, is driven partly by advances in manufacturing--look at the photo below and you'll see big, human-scale screws and solder points which show LC-era machines were largely assembled by people--but also by the shrinking of individual components.

Look closely at the hard disk, that big black brick that takes up most of the photo below, and you'll see there's a microSD card sitting on top of it. That particular microSD is an old 8GB one I scrounged up, but you can now get them in capacities up to 512GB. Half a terabyte is 12,800 times the capacity of that bulky 40MB hard disk in the LC, yet the microSD card is smaller, faster, more power-efficient and, although I haven't run the numbers, probably cheaper too.
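If you want to check that ratio yourself, the arithmetic is simple (a quick sketch in Python, assuming the LC's drive was the 40MB model and using decimal marketing units, i.e. 1GB = 1,000MB):

```python
# Sanity-check the "12,800 times" claim.
# Assumptions: the LC's hard disk is the 40 MB model, and capacities
# use decimal (marketing) units, so 1 GB = 1000 MB.
lc_disk_mb = 40
microsd_gb = 512

ratio = (microsd_gb * 1000) / lc_disk_mb
print(ratio)  # 12800.0
```

(Using binary units, where 1GB is 1,024MB, the factor would be slightly larger still--around 13,107--so 12,800 is, if anything, the conservative figure.)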

The sheer scale of that progress staggers me, but it's fun to imagine what will impress us in another 25 years compared to what impresses us now. You know the sort of thing; "Before you were born, I had to make do with a computer that ran at 2.6GHz! Two point six! And yes, gigahertz!"

If I've learned anything about predicting the future in technology, it's that you can't simply look at the market now and plot the trend in a straight line 10, 20, 50, or 100 years out, however tempting and natural that is. For one thing, a technology's progress can speed up or slow down--or, more likely, the technology gets replaced altogether as the very fundamentals change. It might therefore be the case that, as I joked on Twitter, we'll store ever vaster swathes of data in vanishingly smaller formats in decades to come. But it might instead be that we don't store anything locally at all, and thus that the thing that sounds quaint to our children's generation isn't "half a terabyte on a device the size of your thumbnail" (as "40MB on a device the size of a paperback", or even more extreme examples, sound to us), but the fact that we used local storage at all.

Might, in fact, the sci-fi writers be right, and computers miniaturize so far that they disappear altogether? I guess it's possible, but if I live to see it I fully expect to be shouting at my great-grandchildren that in my day, we touched our computers, dagnabit. And then inevitably I'll haul out my LC and try to explain all of the above, which will probably earn me nothing but eye-rolling. Jeez, my nonexistent great-grandchildren are jerks.