Most people like to ignore what they don’t understand and look for logic at any cost. A lot of them like to believe that every question has a logical answer and that there’s an order to everything that is comfortably neat and tidy, based purely on what they like to perceive as empirical evidence. As I’ve probably quoted David Bohm, the great quantum physicist, before, it’s worth reiterating that a great many people think they are thinking when they are merely rearranging their prejudices. And sometimes, particularly when dealing with so-called pundits, the great unwashed find it difficult to differentiate between the two.

Since Apple’s fortunes have been on the rise for nearly half a decade now, most industry pundits have been quietly eating humble pie after the dire predictions of imminent demise that so many bandied about in the late 80s and early 90s. Even the most devoted sceptics of years gone by have apparently stopped fretting over Apple’s future and have become grudgingly forgiving of perceived past mistakes. And to be fair, pundits reckon there were quite a few.

PC hardliners still bang on about Apple’s refusal to license the Mac OS in the late 80s and a legacy of questionable management, dubious product ranges and a declining market share that lasted until the end of the last century. But since Uncle Steve came riding in on his proverbial white horse, Apple has returned to what it does best: making cool products based on proprietary architectures. And even the most critical hardline PC pundits have had to admit that, today, Apple’s share price is somewhat more than acceptable and the company is still defining what’s cool and fashionable in the larger digital arena. Well, most of them, anyway.

Recently, a Harvard professor called Clayton M Christensen claimed that Apple’s success is built on a strategy that simply won’t stand the test of time: once all its clever technology matures, industry standards will emerge, leading to the standardisation of interfaces. In his book, Seeing What’s Next: Using Theories of Innovation to Predict Industry Change, he suggests that major paradigm shifts in the industry will let other companies specialise in pieces of overall systems, and that successful products will, in the long term, become modular. When this happens, the competitive advantage of the early leader, eg Apple, dissipates, and the ability to make money migrates to whoever controls the performance-defining subsystem.

Clayton thinks that during the early stages of an industry, when the functionality and reliability of a product aren’t yet adequate to meet customers’ needs, a proprietary solution like the iPod is almost always the right approach, because it allows you to knit all the pieces together in an optimised way. Apple may think the proprietary iPod is its competitive advantage but, according to Clayton, this is just temporary. In the future, what will matter will be the software inside that lets users find exactly the kind of music they want to listen to, when and where they want to, with minimal effort.
Well... I thought that’s what Apple already does.

Good old Clayton thinks Apple is facing a ‘fork in the road’ comparable to the one it ostensibly faced when it chose not to open up the Mac in the 1980s. Or, as he puts it, ‘when they let Microsoft become Microsoft’. He also predicts that, within three years, proprietary architecture won’t be as dominant as it is now and that some ‘industry standard’ type of corporation will eventually take over. So, if Apple doesn’t see the fork in the road, doesn’t open up the architecture and doesn’t begin trying to be the iTunes inside all MP3 players then, according to Clayton, it’s going to have to keep coming up with the next cool thing.

So what’s the problem? Isn’t that precisely what Apple has always done best? Isn’t that an essential element of Jobs’ magick that has allowed the company to defy the belief that every question has a logical answer and that there’s an order to everything which is comfortably neat and tidy and based purely on what pundits like to perceive as empirical evidence? Having done consultancy for a lot of large commercial and government organisations, I have experienced first-hand the damage and misinformation generated by trendy business-speak and disturbingly content-free pronouncements on industry trends and solutions. This is definitely an arena where suits take themselves way too seriously, particularly if the con is good enough to generate a fat sum of money. They will happily think they are thinking – or accept the thinking of someone they’re paying – when they are merely rearranging their prejudices or blindly adopting those of others.

Most of these pundits seem to have missed the growing awareness of the importance of chaos, where sensitive dependence on initial conditions means that tiny differences in input can quickly become overwhelming differences in output. And as long as Apple manages to keep performing some loose equivalent of the butterfly effect, its stirrings will continue to have a major impact on the evolving digital marketplace.
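
For anyone curious about what that sensitive dependence actually looks like, here’s a minimal sketch using the textbook logistic map at r = 4 – a standard chaotic system chosen purely for illustration, with nothing to do with Christensen’s model or Apple’s numbers. Two inputs that differ by one part in a million end up wildly apart within a few dozen iterations:

# Illustrative only: the logistic map x -> r*x*(1-x) at r = 4 is a textbook chaotic system.
# Two starting values a millionth apart quickly diverge into completely different trajectories.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.400000, 0.400001   # tiny difference in input
for step in range(1, 41):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(f"step {step:2d}: |a - b| = {abs(a - b):.6f}")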

I think I’ve pointed out in the past that shallow ideas can be easily assimilated, but ideas that require people to reorganise their picture of the world provoke hostility. In the sphere of non-linear dynamics, I don’t think good old Clayton’s tired reappraisals of Apple’s past failings offer anything new or will even manage a respectable academic shelf-life, let alone have any significant impact on the future of the company. We’ve heard the death knell and read the posturing obituaries before and, if you look back at any of them now, they all look pretty shallow and stupid in all their tabloid sensationalism. True, nothing lasts forever. But I think that as long as Apple stays true to its own ethos, we’re going to see a lot more interest and innovation before we approach a dust-to-dust scenario. For now, Apple should be content to let those who choose to substitute rearranging their prejudices for original thought simply do so, while paraphrasing, in the best prankster tradition, Mark Twain’s comment that reports of its death were greatly exaggerated. MW