I love my iPhone. I lust after an iPad. My son’s first music player was an iPod. I think the MacBook Air looks cool. I believe Steve Jobs, who exited Apple as CEO last week, is a true American genius who changed our world, and I don’t think I’d get a lot of disagreement on that.
But there’s a dark side to Jobs and the company he co-founded that didn’t get much attention last week in the hundreds of articles that appeared about his resignation, which was apparently for health reasons.
I’m not talking about his personal life, but about how his widely reported “dark side” traits, the manipulative, control-freak egomania, shaped Apple’s entire approach and helped set the company apart from most of the industry.
Apple’s historic 1984 Super Bowl commercial, in which a colorful female hammer-thrower smashes a screen that has mesmerized its colorless male audience into rigid conformity, introduced the Macintosh computer as a darling of the creative community, the antithesis to IBM’s blue suits.
Ironically, it should be clear to anyone who looks closely that Apple’s entire success has been built on a rigid conformity of its own. In short, Jobs-the-control-freak is big on innovation, as long as it’s his innovation.
At the beginning, the Mac was built on a closed architecture, meaning most of its parts were proprietary and the operating system was made to work with those proprietary parts and nothing else.
Hacking a Mac was nearly impossible. Writing an application to run on one required using Apple’s software development kit (SDK) and playing by Apple’s rules on how the program would work with the Mac operating system and hardware.
IBM, America’s dominant computer manufacturer at the time, was caught flat-footed by the popular success of Apple’s personal computers. Fat and happy with the lucrative business and government market for more-powerful computers, IBM had turned a blind eye to the consumer market.
To catch up, IBM designed its personal computer to use components manufactured by other companies. Even the operating system, DOS, was licensed from a Seattle start-up called Microsoft. The design, for expedience, was essentially open architecture, meaning others could fairly easily copy the design and create a clone.
And many did, with variations and improvements of their own. It seemed as if no two “clones” were alike, and even DOS was available from several different companies with variations. IBM lost all control over standards and, eventually, the marketplace.
It was a chaotic free-for-all for PCs. Anyone could write a program for DOS, and a rich community of “shareware” and “freeware” programmers developed, with literally tens of thousands of inexpensive titles for consumers to choose from—which might or might not run on their machines, given the lack of standards.
On the hardware side, the competition drove prices down. In 1986, I bought my first PC-XT clone, hand-assembled at a mom-and-pop shop on Mission Gorge Road in San Diego, for $1,150, including an amber-text monitor and dot-matrix printer. At the same time, a similarly equipped Mac was going for more than twice that price.
Macs became known for their ease of use and reliability, but they were also expensive and had only a fraction of the programs available for them. PCs were known for crashing, but also for being inexpensive and customizable, with many more programs and hardware devices available for them.
PCs, with their price advantage and variety, came to dominate the marketplace.
This all may seem like ancient history except that it cemented the path that Jobs/Apple would take. Although the Mac was eventually forced by the marketplace to become more open and compatible, other Apple products continue, to this day, to be built with closed architecture and proprietary formats.
The iPhone and iPad use applications, or apps, that are only legitimately available through Apple’s App Store. Apps are submitted by developers and reviewed by Apple before they can be put in the store.
In that way Apple can enforce standards, but it has also been known to stifle or limit apps for no apparent reason other than that they compete with Apple or AT&T, that they use the hardware in an unexpected way or, as in the case of Flash, that Steve Jobs doesn’t like Adobe.
Some of the more interesting app innovations for the iPhone have shown up either as hidden “Easter egg” features sneaked past Apple’s review, with a way to enable them leaked to users, or through the Cydia store for users who “jailbreak” their iPhones, hacking them to be more open.
Android-based phones and tablets, on the other hand, use an open operating system developed by Google. Anyone can modify the Android code to create their own version, and many companies and hackers have. The Android marketplace is open; there is no review required before an app can be offered.
And so now there are many more versions of Android phones, offering innovative features unavailable on the iPhone, and many of the Android phones are less expensive, work with more carriers, etc. Sound familiar?
But not all apps will run on all Android phones, and some believe Google made a mistake by letting standards get out of control. Sound familiar?
Android-based phones are now outselling the iPhone. The Android-based tablet market, just getting its footing this year, will eventually tell the same story against the iPad.
Does Apple welcome this innovation? Taking advantage of a dysfunctional U.S. Patent Office, Apple is taking on manufacturers with patent infringement suits worldwide, for everything from the simple concept of a tablet computer (first shown in the 1968 film 2001: A Space Odyssey) to the idea of swiping across a screen to unlock a phone or tablet.
It’s unlikely Apple will win in most cases, and in cases where Apple does win, it won’t be long before its competitors find work-arounds that are judged original.
Some see Apple’s actions as an effort to quash innovation, and there has even been speculation that the main reason Google bought Motorola Mobility for $12.5 billion on Aug. 15 was to obtain Motorola’s patents as a defense in case it is sued.
The digital world today would not be nearly as advanced, as innovative and as fun were it not for Steve Jobs and Apple. Conversely, the digital world today would not be nearly as varied, as flexible and as affordable were it not for Apple’s competitors and the open-source/open-architecture community that is Apple’s antithesis.