Friday, June 27, 2008

On Apple's Software Strategy: Part I

An Apple update today to its "Pro Applications" turns my thoughts to the company's overall software strategy. I do not use the word 'strategy' in the vacuous sense here of "let's make, you know, the apps that people buy and then make bank on it!" I mean the company's decisions about how Apple will use software (its own software, others' software) to achieve Apple's objectives. Apple's objective may have evolved a bit beyond merely selling boxes, but a look at Apple's revenues shows that Macs and iPods dwarf the payments Apple gets from service agreements, software as a service, and even software licenses and upgrades. I submit that Apple's objective is to make money selling computer hardware.

That being said, what's the deal with the software upgrade? This is a hardware company, right? By comparison, Microsoft is a software company -- Steve Jobs pointed out that before Bill Gates, nobody knew you could build a software company -- and Microsoft has the largest group of developers for the Mac OS X platform outside Apple ... and is hiring. With operators like Microsoft around, why does Apple need software, much less a software strategy?

Why, indeed.

When I first became interested in Apple, I heard that Apple was buying NeXT. This was an eye-opener. A few years before, while doing support work for a local Catholic college's computer networks, I realized I was spending most of my time putting out fires on MS-Windows desktops.

(Now, for a little aside. While writing data-manipulation macros in Excel at a major medical center, I discovered that when I typed examples verbatim from the instruction manual, I got syntax errors. Yes, there used to be paper manuals. No, I could not get a human when I called for support. No, they didn't have the slightest interest in the bugs I discovered while writing the macros. I later learned that folks trying to report errors in the APIs offered by NT had to pay triple-digit sums for "support" that required them to wade through several tiers of idiots who didn't comprehend the bug before reaching someone who seemed to understand, but would not admit, that Microsoft's API wasn't behaving properly because of bugs. This experience admittedly colors my view of Microsoft products and the likelihood I myself could coax proper function from them. Now, back to our regular programming.)

The fires on the MS-Windows desktops weren't always obvious to diagnose, so we often ended up reinstalling over the network. This was so common that everyone in the department came to memorize the network drive location where the MSFT install discs lived. There were two classes of computers that didn't seem to need a lot of babysitting once set up.

One was the Macs, but there weren't many Macs (just in the music department), and they weren't tasked with lots of oddball things; the one guy in the support group who dealt with Macs was basically the Maytag repairman, and he spent most of his time putting out fires on MS-Windows desktops, too. I didn't really have much basis for evaluating the Macs, as my own experience with them had been pretty bad (what's the mouse for? why can't you launch programs from the command line? why won't it launch? you mean if I click it twice it will do something other than twice what it does if you click it once?) but I realized some people really were attached to them. It just wasn't my thing.

The other kind of computer that didn't seem to need much intervention to do as instructed was the kind that was running Unix. There were several Unix machines on the campus, all named for different saints. Basil, for example, was an SGI Indigo with 64MB RAM and 4GB of hard drive. It was ostensibly purchased to handle creation of advertisements and promotional materials, but I think the truth is that (a) nobody at the school really understood how to use then-cutting-edge graphics workstations and (b) the network administrator wanted to play Doom on an enormous SGI monitor with a frame rate he wasn't likely to see on any machine costing less than five figures at the time. The machine was used for web development, though, and probably its graphics hardware sniffed with disdain as it was tasked with cropping photos and preparing images for distribution on the web.

Basil wasn't just a glorified Photoshop box. Basil was also the DNS server for the campus, so every browser trying to find web sites asked Basil for the IP address that would get them to Netscape or Sun to check out the latest new web technology they needed to have immediately. Every student who needed email fired up Pine in a terminal emulator on an NT station (yes, there was a day when tools like Pine and Elm really were the dominant means of accessing email) and got that email from Basil, which also hosted the sessions the students used to read (and draft, and save, and send) their messages. Basil also (heh) ran the network storage for students, so they could save files they made anywhere on the network. And when the world wanted to know something about the University, it was Basil that served those requests: after handling the image manipulation, Basil also served as the web server.
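For readers who never ran a box like Basil, the name-service role is simple at heart: a client hands over a hostname, the server answers with an IP address from the zone it's authoritative for. A toy sketch (hostnames and addresses here are made-up examples, not the real campus zone):

```python
# Toy model of an authoritative DNS lookup, the service Basil provided
# for the whole campus. The zone data below is invented for illustration.
ZONE = {
    "www.example.edu": "192.0.2.10",   # the campus web server (Basil itself)
    "mail.example.edu": "192.0.2.10",  # same box: Basil also handled mail
}

def resolve(hostname: str) -> str:
    """Answer a lookup from the local zone, or report the name as unknown."""
    try:
        return ZONE[hostname]
    except KeyError:
        raise LookupError(f"NXDOMAIN: {hostname}")

print(resolve("www.example.edu"))  # -> 192.0.2.10
```

Real DNS adds caching, recursion, and record types on top, but every browser on campus was implicitly asking Basil this one question.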

Well, "after" isn't really accurate. Basil did all this stuff at the same time, on a single processor, by sharing time slices among processes. Nobody seemed to lose email, nobody (barring a configuration issue on their MS-Windows client) seemed unable to reach network storage, nobody seemed unable to reach the net. The automated scripts Basil ran to snoop the network for misbehavior ran like clockwork, and made little notes in log files for later human digestion. Oh, and while all this was going on, Doom was running full-screen at the highest frame rate money could buy. And it never went down.
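The time-slicing trick is worth a moment. A toy round-robin scheduler captures the idea (the task names are invented; a real Unix kernel preempts processes on timer interrupts rather than waiting for them to yield):

```python
# Toy round-robin scheduler: one "CPU" interleaves several cooperative
# tasks, the way a single-processor Unix box shares time among processes.
from collections import deque

def task(name, steps):
    for i in range(steps):
        yield f"{name} step {i}"   # yield = end of this task's time slice

def run(tasks):
    """Give each runnable task one time slice per turn until all finish."""
    queue = deque(tasks)
    trace = []
    while queue:
        t = queue.popleft()
        try:
            trace.append(next(t))
            queue.append(t)        # still runnable: back of the queue
        except StopIteration:
            pass                   # task finished; drop it
    return trace

trace = run([task("dns", 2), task("mail", 2), task("httpd", 2)])
print(trace[:3])  # -> ['dns step 0', 'mail step 0', 'httpd step 0']
```

Slice fast enough and every service appears to run continuously, which is exactly the illusion Basil maintained while Doom hogged the screen.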

While I was trying to learn Unix enough to get by, I realized that the interface was never going to catch on for the masses: one needed facility with cryptic commands and scripting conventions that just wasn't suited to the mass audience. This was the realm of dedicated hobbyists and genuine professionals.


My interaction with Basil had been via terminal emulators -- from elsewhere on campus, or from home via my OS/2 Warp box and the modem (well, not the modem IBM sold me; that one wasn't compatible with IBM's operating system -- just with Microsoft's -- I needed to go buy someone else's modem to make it work) and my only phone line. So when I actually sat at Basil and worked from the terminal, I was blown away. The DEC machines I'd used had colored backgrounds behind the many terminal windows you might have up, but one never got the sense one could make the machines do anything using a graphical user interface; the images were just to help you move all the running application windows about as they ran on distant servers. The SGI, though, had folders full of icons one could rearrange and resize with blinding, hardware-accelerated speed, and you could use the desktop metaphors as easily as command line instructions to get results from the machine. The light went on.

Mind you, I wasn't able to do much immediately with the GUI on the SGI, and the SGI expected users to "get" Unix. (Well, Irix, which was SGI's Unix.) But the future was clear: to get a machine that would handle utter abuse with cheer, but which required no special skills ordinary people would be loath to acquire, there was an obvious recipe. Someone -- a someone with a background in user interfaces that people would accept -- needed to engraft such an interface atop Unix, just as SGI had engrafted its X-Windows desktop atop Irix, and folks would get a machine that didn't go to hell nonstop, and would let them do some work.

It wasn't a big leap to imagine Apple might be a company whose user interfaces would be acceptable. The problem was that Apple seemed so full of itself, so convinced of the righteousness of its existing operating system (which lacked protected memory, could not enforce acceptable processor-scheduling behavior on misbehaving threads, and generally left you at the mercy of the worst programmer whose application happened to be running), and so uninterested in the demands of the real world, that the idea seemed doomed to die as a concept.

Heh. But then I heard Apple was buying NeXT. I called my stock broker: what did he know about Apple's financial condition? Nada. Everyone had given it up for dead. Hmm ....

NeXT had made what I imagined to be serious Unix workstations, and it had supported embedding sound and images in email back when that still looked like science fiction. That the machines had been a commercial bust wasn't something I fully appreciated, but I did know the only plausible reason for Apple to buy a Unix vendor was to leverage Unix into a future product. There was just no alternative. And it matched my vision of what computing should look like.

AAPL had been in the high teens as I chewed on the idea, and then I was told some insider (it turned out to be Steve Jobs, before becoming iCEO) had dumped millions of shares and driven the price to about $14. I began to lose my nerve. I wasn't rich ... what if I was wrong? People had been buying junk in the computer market for years, and there was no sign it'd abate. And maybe Microsoft would pull NT into something that sucked less than what I'd seen.

At length, I bought at $20.35 (this was, I think, two splits ago) on the conviction that Apple's Unix acquisition would enable it to launch rack-mounted servers, take over the back office, and replace costly NT systems with the same TCO argument NT had used to displace CLI Unix. I had no idea I would make money on candy-colored consumer desktops or portable music players. Had I never sold any shares at all, I'd have done quite a bit better; by trading in and out, I missed some good upside.

Why was Apple's software acquisition useful for Apple to sell hardware? To answer that, we need to look at why people buy computers (without software, they're useless); at why people choose to write software for a specific platform and not some competitor (the cost of the tools impacts Linux development; the consistency of the APIs from release to release and support for already-existing software has been a major factor in MS-Windows' endurance; the perceived willingness of users to accept a particular platform has been an issue as people over time variously attempted to deliver solutions on Unix, NT, proprietary embedded systems with names folks might not recognize, and so on); and at how all this relates to where the market is going.

I'll try to hit this in Part II. The key to Part I is that Apple needs a software strategy to sell hardware, and without a strategy to protect hardware sales with software, Apple was probably as dead as critics predicted. Apple did turn itself around -- in part through vicious cost cutting, in part by focusing its marketing message on products it could deliver in volume where there was demand for them, and in part because its evolving software strategy enabled it to chase markets that previously had been completely inaccessible to Apple. Oh, and Apple's software strategy made Apple freer to change its hardware with the times, a definite competitive edge for a hardware vendor looking for either a cost advantage or a performance edge over competitors trying to sell the same customers a piece of hardware. I'll hit Part II over the weekend.
