I mean ... where is there to go from here, right?
Let me repeat my comment from the last post: I bought without the intent, or even an inkling, that I might inadvertently make money on a fashion for translucent colored desktop computers, or a revolution in portable music players, or the advent of Apple as a major entertainment reseller. Honestly, I don't think Apple had any inkling that its music players would be such a hit, either, but that's getting ahead of the story. We're here about the connection between Apple's software strategy and Apple's source of profit, its hardware business (Macs, iPods, and the new business in phones).
People don't buy computers because they make them taller or longer-legged, or because they are versatile accessories. People buy them because they want certain things done. The things they wanted done in the late '90s produced a number of questions for folks considering buying Apple's products:
- Can I email MS-Word documents to co-workers? (ans: yes, but whether you can create them or read them depends on whether your version and the version on the non-Macs you're emailing use compatible file formats, which despite Microsoft creating both application versions isn't a sure thing)
- Can I open WordPerfect attachments? (ans: WordPerfect for Mac hadn't been updated in years and was later discontinued, so until OpenOffice appeared for free you were screwed)
- What will happen to all the software I already own? (ans: those apps will look beautiful on the shelf where they sit. Or you can use an emulator program to run a Microsoft operating system atop a pretend-PC on your Mac, and access anything you like so long as you are patient)
- Do they make games for the Mac? (ans: sure!)
- OK, I mean ... good games? (ans: dude, you're being mean now!) (OK, Myst was developed on the Mac, but its developers saw their sales coming from the PC and moved development platforms. Id's John Carmack, who was nostalgic about NeXTSTEP, posted in a .plan file when MacOS X appeared that it was worlds better for developers than MacOS 9 and its predecessors, and announced plans for simultaneous Mac and NT releases -- but he eventually panned early MacOS X in comparison to WinNT for developing 3D games. In the late '90s and early '00s the answer wasn't encouraging, even though there were a couple of slick titles from Mac developers. Pangea's Nanosaur in 3-D was so cool. But Carmack, no Microsoft apologist, made it clear that games performance on Macs wasn't an evil plot to make Macs look bad: it was a result of the performance PowerPC machines actually delivered, plus the unsurprising and natural preference of developers to spend their optimization time on the platforms with the most sales.)
Steve Jobs obviously saw this was true when he announced the slick new operating system without telling folks exactly when it'd ship. Why did Apple need a slick new operating system?
First, security means more than just crackers trying to turn your machine into a slave-bot spam engine. Security means knowing that the mere act of running imperfect applications on your machine won't compromise the good work of the other applications. (Keeping applications from overwriting the memory in use by other applications -- or by the operating system itself -- is called protected memory, and MacOS 9 didn't have it any more than Win95 did.)

Security includes reliability: if your machine can't be depended on to do tasks when told, you can't expect your faxes to go out or your important email to be collected. (Making sure misbehaving, non-sharing applications don't hog all the computing resources and starve the applications you actually want to see run requires someone to play referee, and that 'someone' is the operating system's scheduler, which does preemptive multitasking -- assigning resources to tasks at the will of the referee rather than at the whim of whatever applications happen to be playing on your machine. MacOS 9 didn't have preemptive multitasking; it depended on everyone to share nicely.)

Security also means knowing something about the users on your system, and preventing them from monkeying with stuff that's not theirs. On a single-user machine this might seem silly, but it's still important: you don't want to inadvertently screw up important system resources with a mis-key, and you don't want misbehaving (or malicious) applications to be able to do so. You don't want your business letters, your financial records, or your really good school paper to get lost because some dingbat wrote a bad entertainment application. Giving different users (including fictitious users created solely for use by particular applications, like the web server) limited permissions is important on any computer system, to prevent badly-coded (or evilly-coded) software from causing you undue irritation. MacOS 9 pretended to have multiple users, but this was a joke, like multiple users on Win95.
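If you want to see what protected memory buys in the smallest possible terms, here's a sketch in C, assuming any POSIX-flavored system (which MacOS X is, underneath): because every process gets its own address space, one program scribbling on "its" memory can't touch another's.

```c
/* Minimal sketch, POSIX assumed: each process has a private address space,
 * so the child's stray write never reaches the parent's copy -- exactly the
 * isolation MacOS 9 (and Win95) couldn't promise. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void) {
    int precious = 42;                  /* each process will get its own copy */
    pid_t pid = fork();                 /* spin up a second, isolated process */
    if (pid < 0) { perror("fork"); return EXIT_FAILURE; }

    if (pid == 0) {                     /* child: stomp all over the value   */
        precious = -1;
        printf("child sees  %d\n", precious);
        return EXIT_SUCCESS;
    }
    wait(NULL);                         /* parent: let the child finish ...  */
    printf("parent sees %d\n", precious);  /* ... still 42, unharmed         */
    return EXIT_SUCCESS;
}
```

The real thing applies that same separation between your word processor and the dingbat's entertainment application.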
I could go on and on, but the upshot is simple: Apple's old operating system was not secure in any sense that mattered to a developer (who didn't want to reboot from a protected network drive after every application crash in case the application's death throes included botching the on-disk system information the developer was depending on to be consistent) or to a user who didn't want to leave a machine running just one known-reliable application all the time. Trust impacts purchase decisions.
Everyone "knows" that Mac users love the interface. What about Mac programmers? It turns out that programmers also have a user interface, it's just not the one customers use. When they sit down to use the developer's tools, they use an application programming interface provided by the system to enable their programs to get stuff done. If your operating system is OpenBSD, that interface is minimalist, designed to expose only the minimum that need be done by the application, and includes no stepladders to get to "frills" users expect, like graphical buttons and text that looks like pretty type. If your operating system is Microsoft Windows NT, your operating system provides all kinds of interfaces to every sort of development interface offered to programmers by Microsoft dating back to the says before Microsoft's operating system could run more than one program at a time, and before user interfaces contained color graphics. The vast variety of programming interfaces gives rise to considerable territoriality among programmers, who have strong views about the suitability of their favored development environment for the kind of work they do.
Upshot: programmers are as picky -- or more so -- about their programming interfaces as their potential customers are about the interfaces presented to them by applications. Modern programming interfaces that reduce programmer time, support good coding habits, and facilitate code maintenance are Good Things™. The programming environment on MacOS 8 and its ilk was reportedly quite good for some things, but it was a 1980s-style procedural rather than object-oriented environment, involving lots of functions with few parallels in the programming universe, accessible only through a high-dollar developer's tool kit sold by Motorola for use by its PowerPC customers -- which, if you read the history books, was a small enough list that Motorola bailed out of the chips business. In short: the development tools were miles behind the competition, and the programming interfaces involved new ideas shoehorned into an ancient software architecture only its mother could love. Apple wasn't going to seduce developers to write for the platform; it was going to have to pray for fanatics who refused to code for anything else.
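For a flavor of the difference -- with invented, hypothetical names, not real Toolbox or Cocoa calls -- here's a sketch in C of the two styles. In the procedural flavor the programmer juggles loose functions and hands the state to each one; in the object style the state carries its own operations, which is roughly the convenience the NeXT-derived environment offered.

```c
/* Hypothetical illustration only: SizeTheWindow, WindowRecord and Window
 * are made-up names.  The point is the shape of the interface. */
#include <stdio.h>

/* 1980s-procedural flavor: free functions, caller threads the state through */
typedef struct { int width, height; } WindowRecord;

static void SizeTheWindow(WindowRecord *rec, int width, int height) {
    rec->width  = width;
    rec->height = height;
}

/* Object-style flavor: data and operations travel together, so the
 * programmer asks the "window" to do things to itself. */
typedef struct Window Window;
struct Window {
    int width, height;
    void (*resize)(Window *self, int width, int height);
};

static void window_resize(Window *self, int width, int height) {
    self->width  = width;
    self->height = height;
}

int main(void) {
    WindowRecord rec = {0, 0};
    SizeTheWindow(&rec, 640, 480);          /* procedural call   */

    Window win = {0, 0, window_resize};
    win.resize(&win, 800, 600);             /* message-like call */

    printf("procedural: %dx%d, object-style: %dx%d\n",
           rec.width, rec.height, win.width, win.height);
    return 0;
}
```

Multiply that difference across an operating system's worth of windows, menus, files, and network connections and you have a rough idea of why developers care so much.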
This, of course, was a problem.
That's because without developers, there's no software. Without the software, there's little reason to buy the machine. Without the hardware sales, Apple's dead. And all for the want of a horseshoe nail.
So this, folks, is why Apple needs a software strategy if it's to succeed in hardware.
As it happened, Steve Jobs had some experience in the movie business, and lots of Mac users had an interest in media creation, as photographers and otherwise. Indeed, despite Apple's scant few percent of market share, the Macintosh platform reportedly yielded Adobe (the developer of Photoshop, which got its start on the Mac) over a quarter of its profit even in those dark days. But back to Jobs: he'd had personal experience at Pixar signing purchase orders for machines to do high-end graphics work, and he was sure there was money in it. Apple's researchers, meanwhile, could see competitors like Avid charging several tens of thousands of dollars for electronic film editing systems that Apple imagined might be done in software on machines it could sell.
Apple bought the Final Cut video editing software -- which depended on special hardware accelerator video cards for real-time effects -- after its development team had bounced from firm to firm, unable to sell the product due to intellectual property licensing restrictions in which Microsoft prevented hardware accelerator vendor Truevision from selling products that worked with QuickTime (Apple's cross-platform multimedia development environment). Apple bought Final Cut as a defensive move, to keep a promising Mac application that might sell desktops -- and with it, Apple desktop sales -- from being killed off. Unable to find a buyer for the product itself, Apple continued developing the tool as a competitor to Avid. Increases in the power of Apple-supplied hardware made more and more effects possible in real time, eventually without the aid of any graphics card fancier than the ones marketed to gamers. Apple's cheap video-editing alternative (compared to the Avid systems) was touted as saving the bacon of small film ventures, and as the software became more capable its comparison to Avid became increasingly favorable.
Apple saw a niche to exploit, and it didn't stop with Final Cut. Apple bought a DVD-authoring application (a Windows app whose customers Apple offered an upgrade path to Macs) to re-brand as DVD Studio Pro, so folks might master DVDs on their Macs using video they'd edited and arranged with the tools Apple offered. Apple later bought a pro audio application (whose Windows version died, etc.). Apple bought the developers of Shake, the application that made possible the special effects seen in Lord of the Rings, and, after issuing a single update to the Windows version, explained how folks could, you know ... migrate to Apple products. Apple cleverly offered cluster computing software for offloading big rendering jobs from the console where Shake was operated, and dropped the costly per-node Shake licensing fee. You just ... heh, heh ... needed to run the thing on Apple's servers ....
In 2002, Apple won an Emmy for Final Cut Pro. Apple, which killed the Windows version of the product promptly on purchase, has since hit its millionth Final Cut Pro license sale and reportedly holds nearly half the video editing market, well ahead of #2 contender Avid (22%). Apple has added server products to manage video content, rolled out storage hardware and software to hold the media users want to access, and now offers a video editing lineup (storage, image manipulation, sound editing, film editing) that covers pretty much all the bases. Although Apple is still filling in the gaps, and tweaking the products to stay competitive and to leverage Apple's hardware offerings, Apple didn't create any of this.
Apple bought it.
And that's been interesting. After buying NeXT's beloved development environment, Apple didn't build everything it wanted from the ground up. Apple didn't know what it wanted. Apple was simply moving defensively: by buying applications that would fill the gaps and keep hope alive, it kept folks from demolishing Apple's ability to make sales by killing key applications.
Apple's buy-a-good-defense software strategy wasn't unleashed just for movie editing. When Apple launched iTunes, it was little more than a relabel of the SoundJam application formerly available from Casady & Greene. Apple didn't yet sell a music player, but Apple saw that folks were using their computers for music. When Apple launched a FireWire-only iPod in 2001, running a licensed Pixo operating system and syncing only with Apple's own operating system, Apple was just working a niche it'd discovered. When the niche was threatened with DRM -- and everyone seemed at the time to agree that Microsoft would eventually overtake the field, because it was the biggest and had the best development budget and the most financial clout, and there wasn't really room for multiple competing standards for devices to license and use -- Apple defensively launched a service to make sure Apple's customers could buy electronic music in formats that would interoperate with its products.
Once Apple announced iPod and iTunes support for Microsoft's operating system, the addressable market for its music products exploded and Apple began seeing serious sales. It's worth noting that Apple didn't envision this or try to foster it at the outset: Apple was essentially prodded into offering support by third-party developers who proved folks would pay money for a way to get the iPod to synch with music collections on computers that ran The Other Operating System. Apple had little alternative to official support if it hoped to maintain quality control over the user experience. After all, user experiences from other vendors can in some cases be pretty poor. Apple's been selling tens of millions of music players a year for a while now, and the latest version of the players has a curious feature.
The latest version, the iPod Touch, runs MacOS X, the operating system Apple migrated folks onto in order to save its hardware business. At first, Apple didn't want to allow third parties to develop applications for the platform (which is the same as the iPhone), but developers revolted and Apple, seeing the possibility of iPhone use in enterprises that needed to deploy applications to workers, relented.
The development environment on this Apple handheld's operating system has an interesting feature: John Carmack wants to develop games for it. And Carmack isn't alone. Since the development tools (which are free) are the same on the Mac and the iPhone, developers attracted to the platform for access to the handheld (or desktop) market find themselves writing software very nearly able to run on the other (the interfaces and input aren't the same; the desktops don't have an accelerometer, for example, and the phone hasn't got a mouse).
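Here's a minimal sketch of what "very nearly able to run on the other" can mean in practice. The function is hypothetical, but TARGET_OS_IPHONE is the real compile-time switch Apple's TargetConditionals.h provides; everything outside the fenced-off branch is the same source on both the desktop and the handheld.

```c
/* Sketch of one codebase targeting both Mac and iPhone; on non-Apple
 * systems this just falls through to the desktop branch. */
#include <stdio.h>
#if defined(__APPLE__)
#include <TargetConditionals.h>
#endif

/* handle_poke is a made-up name for whatever the app does on user input */
static void handle_poke(void) {
#if defined(TARGET_OS_IPHONE) && TARGET_OS_IPHONE
    printf("responding to a touch (or a shake of the accelerometer)\n");
#else
    printf("responding to a mouse click\n");
#endif
}

int main(void) {
    handle_poke();   /* all the logic around this call can be shared as-is */
    return 0;
}
```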
So we see Apple's software strategy in action: Apple buys, buys, buys to broaden the market it can address with its products. Apple's new development environment (which is largely unused in these Apple acquisitions) attracts third parties to write the applications folks want to see. And -- and this is kinda interesting, as it's not yet proven itself a winner -- Apple is using its NeXT-derived development environment to build novel applications to address customer needs. (Yes, Keynote is beautiful, though the performance of the 1.0 application was abysmal. Aperture 1.0 was a fiasco, useful mostly as an object lesson on the mythical man-month. Who knows when iWork's document or spreadsheet applications will be ready for prime time?)
Now that we've seen what Apple's done to defensively protect its ability to make hardware sales by promoting key niche software and encouraging third party developers to add value to Apple's platform, Part III of this series will examine where this puts Apple in the competitive landscape and what it means for the future. Remember that stock graph? That's what we need to know: do we hold or cash out?