I've heard the exasperation of Mac gamers loud and clear over the past few months. System requirements for new games are getting really high. Theories about why have run all over the map, so here's a little background.

First off, if you're convinced that Mac game developers are suddenly incompetent, hate your freedoms for still owning a G5, or are trying to steal your precious bodily fluids, take off the tin foil hat. It's not that. Well, the jury is still out on trying to steal your precious bodily fluids, but I can guarantee you the rest isn't true.

Game system requirements have been steadily growing on the PC side for years, and because we get so many of our games from the PC side of the fence, it's only natural that the trend spills over to the Mac. It's not just because graphics are getting more complicated, although that's certainly part of it; games are also increasingly dependent on ever-more complex "middleware" to manage things like artificial intelligence, physics effects, and networking, and those capabilities eat up cycles.

Add to that the performance of Apple's own OpenGL graphics drivers, which every 3D game on the market depends on. Apple is certainly aware that its OpenGL stack doesn't always perform at the same level as Microsoft's DirectX drivers, but it still lags when it comes to the game-specific optimizations that could really help. This is a source of constant frustration to some of the game developers I speak to.

System requirements for Mac games sometimes scale a bit higher than their PC counterparts. A lot of this comes down to a different set of standards used by some of the Mac game makers, who in many cases try to offer realistic expectations for how games should be played on the Mac, and thus, specify faster equivalent systems.

Then there's the transition to Intel microprocessors. On one hand, this has spawned a new generation of games that we wouldn't have had access to without Intel. Think of the Cider-based games that EA has released, for example, made possible only by translation technology that requires Intel chips to work. But many of those games exact a heavy price: a translation layer like Cider carries a real performance cost.

But it's also expensive to develop games to run on multiple processor architectures, especially architectures as radically different as PowerPC and Intel. It's expensive in development man-hours, and those man-hours need to be recouped through additional unit sales.

And "unit sales" are ultimately what it's all about. It may seem logical that you'd want to spread your net as widely as possible to recoup your development investment; that you'd want as many Intel and PowerPC-based Macs to run your game. But the fact is that the older a machine is, the less likely someone is to buy any software for it. So the pool of Macs that a game developer can reasonably target dwindles over time.

What it comes down to in the end is that this is a continuing problem, and it's not going to get any better, or any easier. I wish there were a "magic bullet" solution. I'm not the only one -- Mac game developers would love for one to exist, too. But as it stands, top-tier commercial games often remain on the "bleeding edge" of Mac system performance, while everyone else gets left behind.