Monday, June 30, 2008

On Apple's Software Strategy: Part II (Buy A Good Defense)

In Part I of this series I discussed my mid-1990s insight that someone with user-interface expertise and the will to do so could replace X11 on Unix and build something of high quality for developers, businesses, and administrators while also offering usability advantages over current alternatives. That part concluded with Apple actually buying NeXT, and my eventually overcoming my fear that I was wrong about a stock everyone (apparently including Steve Jobs, whose liquidation drove the shares to about $14 two splits ago) had given up as dead money. I ended Part I with a slew of questions about why Apple's hardware and software businesses were related and what this might tell us about Apple's trajectory from its current perch, so far above where Apple was a few years ago.

I mean ... where is there to go from here, right?

Let me repeat my comment from the last post: I bought without the intent or inkling I might make money inadvertently on a fashion for translucent colored desktop computers, or a revolution in portable music players, or the advent of Apple as a major entertainment reseller. Honestly, I don't think Apple had any inkling that its music players would be such a hit, either, but that's getting ahead of the story. We're here about the connection between Apple's software strategy and Apple's source of profit, its hardware business (Macs, iPods, and the new business in phones).

People don't buy computers because they make them taller or longer-legged, or because they are versatile accessories. People buy them because they want certain things done. The things they wanted done in the late '90s produced a number of questions for folks considering buying Apple's products:
  • Can I email MS-Word documents to co-workers? (ans: yes, but whether you can create or read them depends on whether your version and the version on the non-Macs you're emailing use compatible file formats -- which, despite Microsoft creating both application versions, isn't a sure thing)
  • Can I open WordPerfect attachments? (ans: WordPerfect for Mac hadn't been updated in years and was later discontinued; until OpenOffice appeared for free, you were screwed)
  • What will happen to all the software I already own? (ans: those apps will look beautiful on the shelf where they sit. Or you can use an emulator program to run a Microsoft operating system atop a pretend-PC on your Mac, and access anything you like so long as you are patient)
  • Do they make games for the Mac? (ans: sure!)
  • OK, I mean ... good games? (ans: dude, you're being mean now!) (OK, Myst was developed on the Mac, but they saw their sales coming from the PC and moved development platforms. Id's John Carmack, who was nostalgic about NeXTSTEP, posted in a .plan file when MacOS X appeared that it was worlds better for developers than MacOS 9 and prior and announced plans for simultaneous Mac and NT releases, but eventually panned early MacOS X in comparison to WinNT for development of 3D games. In the late '90s and early '00s the answer wasn't encouraging even though there were a couple of slick titles from Mac developers. Pangea's Nanosaur in 3-D was so cool. But Carmack, no Microsoft apologist, made it clear that games performance on Macs wasn't an evil plot to make Macs look bad, but a result of the performance PowerPC machines actually delivered plus the expected results of the unsurprising and natural preference of developers to spend time optimizing applications to perform best on the platforms with the most sales.)
In short, everything users want to do turns on software questions. If the machine is lightweight or pretty, that might be a plus, but inability to do the things the user demands is a deal killer.

Steve Jobs obviously saw this was true when he announced the slick new operating system without telling folks exactly when it'd ship. Why did Apple need a slick new operating system?

First, security means more than just crackers trying to turn your machine into a slave-bot spam engine. Security means knowing that the mere act of running imperfect applications on your machine won't compromise the good work of the other applications. (Keeping applications from overwriting the memory in use by other applications -- or by the operating system itself -- is called protected memory, and MacOS 9 didn't have it any more than Win95 did.)

Security includes reliability: if your machine can't be depended on to do tasks when told, you can't expect your faxes to go out or your important email to be collected. (Making sure misbehaving, non-sharing applications don't hog all the computing resources and keep your machine from giving time to the applications you want to see run requires someone to play referee, and that 'someone' is the operating system's scheduler, which does preemptive multitasking -- assigning resources to tasks at the will of the referee rather than at the whim of the applications that might be playing on your machine. MacOS 9 didn't have preemptive multitasking; it depended on everyone to share nicely.)

Security also means knowing something about the users on your system, and preventing them from monkeying with stuff that's not theirs. On a single-user machine this might seem silly, but it's still important: you don't want to inadvertently screw up important system resources with a mis-key, and you don't want misbehaving (or malicious) applications to be able to do so. You don't want your business letters, your financial records, or your really good school paper to get lost because some dingbat wrote a bad entertainment application. Giving different users (including fictitious users created solely for use by particular applications, like the web server) limited permissions is important on any computer system, to prevent badly-coded (or evilly-coded) software from causing you undue irritation.
MacOS 9 pretended to have multiple users, but this was a joke, like multiple users on Win95.
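The cooperative-versus-preemptive distinction can be sketched in a few lines of modern Python -- a toy illustration, obviously not period Mac code: a thread that busy-loops and never yields voluntarily still can't starve its neighbor, because the runtime preempts it on a timer.

```python
import threading
import time

counter = {"ticks": 0}
stop = time.monotonic() + 0.5  # let both threads run for half a second

def hog():
    # A badly-behaved task: busy-loops and never yields voluntarily.
    # Under cooperative multitasking (MacOS 9 style), this would freeze
    # every other task until it finished.
    while time.monotonic() < stop:
        pass

def polite_worker():
    # A task that just wants occasional slices of CPU time.
    while time.monotonic() < stop:
        counter["ticks"] += 1
        time.sleep(0.001)

threads = [threading.Thread(target=hog), threading.Thread(target=polite_worker)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# With preemptive scheduling, the worker made progress despite the hog.
print(counter["ticks"] > 0)  # prints True
```

Under cooperative scheduling, the hog's refusal to yield would have frozen everything else until it finished; a preemptive scheduler makes its bad manners irrelevant to its neighbors.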

I could go on and on, but the upshot is simple: Apple's old operating system was not secure in any sense that mattered to a developer (who didn't want to reboot from a protected network drive after every application crash, in case the application's death throes included botching the on-disk system information the developer depended on being consistent) or to a user who didn't want to leave a machine running just one known-reliable application all the time. Trust impacts purchase decisions.

Everyone "knows" that Mac users love the interface. What about Mac programmers? It turns out that programmers also have a user interface, it's just not the one customers use. When they sit down with the developer's tools, they use an application programming interface provided by the system to enable their programs to get stuff done. If your operating system is OpenBSD, that interface is minimalist, designed to expose only the minimum an application needs, and includes no stepladders to "frills" users expect, like graphical buttons and text that looks like pretty type. If your operating system is Microsoft Windows NT, it provides stepladders to every sort of development interface Microsoft has offered programmers, dating back to the days before Microsoft's operating system could run more than one program at a time and before user interfaces contained color graphics. The vast variety of programming interfaces gives rise to considerable territoriality among programmers, who have strong views about the suitability of their favored development environment for the kind of work they do.

Upshot: programmers are as picky -- or more so -- about their programming interfaces as their potential customers are about the interfaces presented to them by applications. Modern programming interfaces that reduce programmer time, support good coding habits, and facilitate code maintenance are Good Things™. The programming environment on MacOS 8 and its ilk was reportedly quite good for some things, but it was a 1980s-style procedural rather than object-oriented environment involving lots of functions with few parallels in the programming universe, accessible only through a high-dollar developer's tool kit sold by Motorola for use by its PowerPC customers -- which, if you read the history books, was a small enough list that Motorola bailed out of the chips business. In short: the development tools were miles behind the competition, and the programming interfaces involved new ideas shoehorned into an ancient software architecture only its mother could love. Apple wasn't going to seduce developers to write for the platform; it was going to have to pray for fanatics who refused to code for anything else.
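To make the procedural-versus-object-oriented contrast concrete, here's a toy sketch in Python. The names are invented for illustration and bear no relation to the actual Mac Toolbox or NeXT APIs:

```python
# Procedural style: free functions shuttling an opaque record around.
# Extending behavior means writing new functions and hoping callers use them.
def window_create(title):
    return {"title": title, "visible": False}

def window_show(win):
    win["visible"] = True
    return win

# Object-oriented style: state and behavior travel together, and a
# subclass can specialize behavior without touching the original code.
class Window:
    def __init__(self, title):
        self.title = title
        self.visible = False

    def show(self):
        self.visible = True

class SplashWindow(Window):
    def show(self):
        super().show()
        self.title = "* " + self.title  # specialized behavior, reusing the base

w1 = window_show(window_create("Report"))
w2 = SplashWindow("Welcome")
w2.show()
print(w1["visible"], w2.visible, w2.title)  # prints: True True * Welcome
```

The object-oriented version lets a developer specialize behavior by subclassing, without editing (or even seeing) the original code -- the property that made object-oriented frameworks like NeXT's attractive to application developers.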

This, of course, was a problem.

That's because without developers, there's no software. Without the software, there's little reason to buy the machine. Without the hardware sales, Apple's dead. And all for the want of a horseshoe nail.

So this, folks, is why Apple needs a software strategy if it's to succeed in hardware.

As it happened, Steve Jobs had some experience in the movie business, and lots of Mac users had interest in media creation as photographers. Indeed, despite Apple's scant few percent of market share, the Macintosh platform reportedly yielded Adobe (the developer of Photoshop, which got its start on the Mac) over a quarter of its profit even in those dark days. But back to Jobs: he'd had personal experience at Pixar signing purchase orders for machines to do high-end graphics work, and he was sure there was money in it. Also, Apple's researchers spotted competitors like Avid charging several tens of thousands of dollars for electronic film editing systems that Apple imagined might be done in software on machines it could sell.

Apple bought the Final Cut video editing software -- which depended on special hardware accelerator video cards for real-time effects -- after its development team had bounced from firm to firm, unable to sell the product due to intellectual property licensing restrictions in which Microsoft prevented hardware accelerator vendor Truevision from selling products that worked with QuickTime (Apple's cross-platform multimedia development environment). Thus, Apple bought Final Cut as a defensive move, to prevent a promising Mac application that might sell desktops from being killed off -- and with it, Apple desktop sales. Unable to find another buyer for the software, Apple continued developing the tool as a competitor to Avid. Increases in the power of Apple-supplied hardware made more and more effects possible in real time, eventually without the aid of specialty graphics cards that would never be marketed to gamers. Apple's cheap video-editing alternative (compared to the Avid systems) was touted as saving the bacon of small film ventures, and as the software became more capable its comparison to Avid became increasingly favorable.

Apple saw a niche to exploit, though. Apple bought a DVD-authoring application (a Windows app whose customers Apple offered an upgrade path to Macs) to re-brand as DVD Studio Pro, so folks might master DVDs on their Macs from the video content Apple's tools let them edit and arrange. Apple later bought a pro audio application (whose Windows version died, etc.). Apple bought the developers of Shake, the application that made possible the special effects seen in Lord of the Rings, and after issuing a single Windows version update explained how folks could, you know ... migrate to Apple products. Apple cleverly offered cluster computing software for offloading big rendering jobs from the console where Shake was operated, and dropped the costly per-node Shake licensing fee. You just ... heh, heh ... needed to run the thing on Apple's servers ....

By 2002, Apple had won an Emmy for Final Cut Pro. Apple, which killed the Windows versions of these products promptly on purchase, has hit its millionth Final Cut Pro license sale and reportedly holds nearly half the video editing market, a position well ahead of #2 contender Avid (22%). Apple has added server products to manage video content, rolled out storage hardware and software to hold the media users want to access, and now offers a video editing lineup (storage, image manipulation, sound editing, film editing) that covers pretty much all the bases. Although Apple is still filling in the gaps, and tweaking the products to stay competitive and to leverage Apple's hardware offerings, Apple didn't create any of this.

Apple bought it.

And that's been interesting. After buying NeXT's beloved development environment, Apple didn't build everything it wanted from the ground up. Apple didn't know what it wanted. Apple was simply moving defensively -- buying applications that would fill the gaps and keep hope alive, so that nobody could demolish Apple's ability to make sales by killing off a key application.

Apple's buy-a-good-defense software strategy wasn't just unleashed for movie editing. When Apple launched iTunes, it was little more than a relabel of the SoundJam application formerly available from Casady & Greene. Apple didn't yet sell a music player, but Apple saw that folks were using their computers for music. When Apple in 2001 launched a FireWire-only iPod, running a licensed Pixo operating system and with support only for Apple's own operating system, Apple was just working a niche it'd discovered. When the niche was threatened with DRM -- and everyone seemed at the time to agree that Microsoft would eventually overtake the field because it was the biggest and had the best development budget and the most financial clout, and there wasn't really room for multiple competing standards for devices to license and use -- Apple defensively launched a service to make sure Apple's customers could buy electronic music in formats that would interoperate with its products.

Once Apple announced iPod and iTunes for Microsoft's operating system, the addressable market for its music products exploded and Apple began seeing serious sales. It's worth noting that Apple didn't envision this or try to foster it at the outset: Apple was essentially prodded into offering support by third party developers who proved folks would pay money for a way to get the iPod to synch with music collections on computers that ran The Other Operating System. Apple had little alternative to official support if it hoped to maintain quality control over the user experience. After all, user experiences from other vendors can in some cases be pretty poor. Apple's been selling tens of millions of music players a year for a while now, and the latest version of the players has a curious feature.

The latest version, the iPod Touch, runs MacOS X, the operating system Apple migrated folks onto in order to save its hardware business. At first, Apple didn't want to allow third parties to develop applications for the platform (which is the same as the iPhone), but developers revolted and Apple, seeing the possibility of iPhone use in enterprises that needed to deploy applications to workers, relented.

The development environment on this Apple handheld's operating system has an interesting feature: John Carmack wants to develop games for it. And Carmack isn't alone. Since the development tools (which are free) are the same on the Mac and the iPhone, developers attracted to the platform for access to the handheld (or desktop) market find themselves developing software very nearly able to run on the other (the interfaces and input aren't the same; the desktops don't have an accelerometer, for example, and the phone hasn't got a mouse).

So we see Apple's software strategy in action: Apple buys, buys, buys to broaden the market it can address with its products. Apple's new development environment (which is largely unused in these Apple acquisitions) attracts third parties to write the applications folks want to see. And -- and this is kinda interesting, as it's not yet proven itself a winner -- Apple is using its NeXT-derived development environment to build novel applications to address customer needs. (Yes, Keynote is beautiful, though the performance of the 1.0 application was abysmal. Aperture 1.0 was a fiasco, useful mostly as an object lesson on the mythical man-month. Who knows when iWork's document or spreadsheet applications will be ready for prime time?)

Now that we've seen what Apple's done to defensively protect its ability to make hardware sales by promoting key niche software and encouraging third party developers to add value to Apple's platform, Part III of this series will examine where this puts Apple in the competitive landscape and what it means for the future. Remember that stock graph? That's what we need to know: do we hold or cash out?

Brit Chick Puzzled Over Drunk College Girls

This article -- a product of the Times, no less -- answers the very question it pretends to ponder, without apparently noticing.

The question:
It was once the preserve of the rugby team, but now female students down more units than boys. Why is the fairer sex drinking so much?
The author seems to find the drinking an irony, rather than an inevitable result, of some facts she's gathered:
Unfortunately for the fairer sex, science is against us when it comes to coping with alcohol.... The combination of our size, enzymes and extra fatty tissue seemingly adds up to a less efficient system for breaking down the booze.
The answer is right in her article:
A survey conducted by the Portman Group in 2005 found that over a third of women surveyed had been sexually assaulted whilst drunk. Almost the same number of women asked had also had unprotected sex after drinking. The latest medical research shows that this number has now almost doubled and unwanted pregnancies and STI’s are a more frequent consequence.
Just to spell it out: Free frathouse booze + naïve fools who think the guys at a university can't be as bad as Dad warned them ==> herds of does, getting slower and more defenseless three times faster than the frat boys operating the keg handles.

Predators don't need to outrun the fastest prey to make a kill -- just the slowest. That's the enormous benefit of being a predator: you can fail again and again and still hunt. Staking out the watering hole doesn't hurt their odds.

The fact that something like 97% of a series of 148 slightly-trained women who fought male attackers succeeded suggests that men who assault women aren't looking to enter and win a fight; they're looking for a good victim. The lesson should be relatively straightforward: don't bend over backward becoming a conspicuously more helpless victim.

Luddites: CERN Will Destroy Earth!

From the apparent safety of their underground bunker within filing distance of the United States District Court for Hawaii, critics of an experimental particle accelerator in -- wait for it -- Switzerland alleged the likelihood that the tool's scheduled activation in August would "ultimately result in the destruction of our planet." Operated by CERN, the collaborative physics research center where the world's first web server was programmed and operated after Tim Berners-Lee published a proposal outlining the workings of hypertext, the supercollider known as the "Large Hadron Collider" would allow experiments on subatomic particles traveling at virtually the speed of light. (If you need eight significant digits to tell the difference, you are close enough that I will not argue with you.)

The quick version of what supercolliders do is focus enormous energy on tiny subjects to create circumstances so unstable and short-lived and tiny that one determines what happened, very often, by looking at what happened afterward and working backward to what must have occurred to yield the things that lasted long enough to be seen. Anything requiring a supercollider to create seems hardly a viable candidate for an immortal world-killing cataclysm. The best we can hope for, basically, is some good film to argue about afterward. Actually keeping some kind of minuscule superparticle or black hole is an idea Crichton might have trouble selling.

The position of the critics seems to be "some knowledge was not meant for Man!" To which I respond that with 6.6% of CERN staff being female, we can just have most of the folks there blink for a minute when the data is printed.

But I joke.

The idea that humans are not fit to know the rules of the game into which they've been born is just silly. The universe appears to conduct itself according to a number of general rules, and where the rules are unclear it's entirely reasonable to conduct experiments to understand what's really happening. The ability to travel hundreds of miles per hour, send email almost as fast as thought (well, sometimes that's not such an asset, is it?), and to produce enough food to sustain the world's billions wasn't a recent gift. Success like this took years of work, each generation building atop the discoveries of predecessors.

To abandon the search into the unknown as hopeless would be such a terrible insult to the memory of those whose dedication and care have brought us so far. While it's not obvious what will be learned through fundamental research, the likelihood that we'll learn nothing of value seems so slight in view of our history that I'm inclined to dismiss it.

It looks like the government's position on this is the same.

We have enough trouble getting well-designed scientific inquiry funded without this lunacy. Maybe later I'll give myself license to vent my spleen on politically-timely issues and how really poor research gets funding while the good work sits around waiting for a sponsor.

Sour Milk: Purchase Misses Target

L likes to buy the best sustenance possible. Gouda from Holland is the tops in cheese according to L. To avoid unnecessary additives L favors milk.

The chocolate must be dark chocolate.

L values quality.


I will avoid the gruesome details of how Borden's gallon of Elsie's Finest, bought at a Super Target on June 27 with a sell-by date of July 2, was, on the very day of purchase, so foul it ruined a meal and forced me to gag on homemade hot cocoa. (Oh, the humanity!)


Suffice it to say that $5.39 doesn't go as far at Target as it used to.

The answer? I get into my Diesel and return to the scene of the crime. En route, I am reminded that the highly-detailed Mercedes literature on the safety features of the car lied about the passenger seat weight sensor's 75-pound setting, as I am beeped at for the affront of putting a jug of milk on the passenger seat while driving a few blocks. The red blur you may see is the seatbelt warning light, a feature which according to the pre-sales literature is supposed to be engaged in the Mercedes E320-CDI when the weight sensor in the passenger seat is triggered, which in turn is described as occurring when it senses 75 pounds of pressure. You are supposed to be able to put your milk on the seat without harassment.


After waiting several minutes to reach the front of the customer service queue to return the milk, I ask Redbeard: "It isn't s'posed to come pre-spoiled, is it?" He says he hopes not, and refunds me via a credit to my American Express Blue card.

I then drive to Whole Foods, buy a 365-brand milk gallon for $3.99, and charge that to my American Express Blue. The interesting thing is that Target, I think, is considered a superstore and so qualifies only for the 1.5% cash rebate, whereas Whole Foods is a grocery store, is on AmEx's list of "everyday shopping" locations (like fuel stations), and thus nets me 5% cash back.

The upshot?

I lose 8¢ in cash back on a $5.39 Target credit, gain 20¢ cash back on a Whole Foods charge, for a net 12¢ cash back atop the $1.40 savings on the milk itself (there's no sales tax on this purchase where I live). Excluding the mileage and time to pull this off -- owing entirely to Borden's transportation or Target's handling of the milk -- I note to myself that the gallon of milk has an effective price that's $1.52 cheaper at Whole Foods.
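The arithmetic works out as described; here's a quick check (prices from the receipts, rebate rates as described above):

```python
target_price = 5.39      # Borden gallon at Super Target
wholefoods_price = 3.99  # 365-brand gallon at Whole Foods

# Returning the Target purchase claws back its 1.5% "superstore" rebate;
# the Whole Foods charge earns the 5% "everyday shopping" rebate.
lost_rebate = round(target_price * 0.015, 2)        # 0.08
gained_rebate = round(wholefoods_price * 0.05, 2)   # 0.20
net_rebate = round(gained_rebate - lost_rebate, 2)  # 0.12

sticker_savings = round(target_price - wholefoods_price, 2)  # 1.40
effective_savings = round(sticker_savings + net_rebate, 2)   # 1.52

print(net_rebate, effective_savings)  # prints: 0.12 1.52
```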

Later I'll have to post on the credit card reward system game. For now, I'll just keep buying milk at Whole Foods, where food is a core mission and not a lure in the hope you will buy an electronic appliance. Besides, the Whole Foods bakery offers the yummy "Seeduction", and many breads worth making into sandwiches.

And L will be going to Whole Foods anyway.

(For those of you concerned with the poor quality of the photographs accompanying this post, I sympathize: they were taken with a v.1.0 iPhone, whose shutter speed is timed to match the pace of an arthritic turtle and whose motion stabilization simply does not exist. I would suggest a tripod for use with this camera, but the iPhone has no tripod attachment. Perhaps some enterprising person will build a tripod with an iPod Universal Connector so iPhone users can occasionally get a non-blurry picture without first embedding the phone in hardening cement, which has to be bad for the button. The tripod could also, you know, have integrated speakers and a solar recharging feature for backpackers breaking for lunch. Just an idea.)

WALL•E Is About Caring

First, about the short:

Before the film started, someone told me about a Dreamworks flick they'd seen recently, and my mind cast back to that firm's work.[1] So when the ads halted and the short began, I wasn't thinking about Pixar and its output. As the short continued, and I found myself drawn in by little details, I began to think this short is good enough to compare to Pixar shorts. The thought crossed my mind a few times, then the curtain fell on the short, Presto ... and I realized it was Pixar's work.

This kind of unconscious comparison probably says something both about the quality of Pixar's shorts -- that they form the standard -- and about the quality of Presto. It's a fun view. The all-time best, though, still has to be For The Birds.

Since WALL•E will be getting a bunch of reviews by people excited or revolted by its environmentalist or political content,[2] let me cut through the fog for you and tell you that the film is not a sudden departure from Pixar's longstanding high-quality family-oriented fare. The friendship and loyalty showcased in Toy Story and the exaltation of the family sticking together shown in The Incredibles is matched in WALL•E by a very simple idea: caring. The movie's lesson isn't that humans are bad, or that machines are bad, or that some specific problem has a specific answer -- indeed, it's clear nobody in the whole movie has got all the facts. The characters in WALL•E are so narrowly confined by their roles that they have no perspective on the supposed political or ecological messages the film is supposed to be pushing.

WALL•E is about caring.

The backstory of humans forever abandoning Earth while supposedly departing on a five-year cruise (Gilligan's Island, anyone?) while diligent little robots tidy up the mess everyone made, is not the story or its answer. It merely sets up the humans' victory condition: coming home. Of course, WALL•E is already home ... but he and his cockroach pet are lonely. And thus we know WALL•E's victory condition: companionship. WALL•E's prospect for companionship -- EVE -- turns out to be a career woman, and not easily impressed, and when we see that even accomplishing her directive isn't enough to satisfy her we realize her victory condition is purpose.

WALL•E is about caring: to get home, the humans must re-learn what home is and care enough to overcome the obstacles to getting there; to obtain a purpose, EVE must care about something beyond her pre-programmed directives; and to win companionship, WALL•E must secure everyone else's victory condition. These things aren't easy. It's been centuries since people have been on the starship Axiom, and its inhabitants all take for granted that they should live on a starship being waited on hand and foot by robots that keep them out of trouble; nobody has an objective for which to strive, and nobody cares about anything beyond the next free food offer. (Get it? On the Starship Axiom, apathetic humans take everything for granted?) And EVE has a very important classified directive to pursue, and lethal powers to achieve it; how can WALL•E hope to impress her? WALL•E's quest for companionship seems only more doomed on the crowded starship: he's immediately classified as a foreign contaminant and beset first by cleaners then by more sinister foes hoping to eject him from the vessel and into the heartless depths of space.

WALL•E is a winner not because it's about people thoughtfully saving the Earth through ecology (they think they are going to grow pizza plants for goodness' sake), but because it offers a Pixar-quality presentation of a story about one underdog against the whole of humanity (and their robot masters) to make someone care. It's a lesson so generic everyone can enjoy it. And it's so universal everyone can feel its truth.

Check out WALL•E. It's a love story about living, not surviving -- it's about having something to care about and doing something about it. It's a true story about things that never happened. It's the best kind of story there is.




[1] Where Monsters Inc. was about love and ethics and duty and betrayal and redemption, Dreamworks' ogre-movie was about fart jokes (from the opening scene, no less), short jokes, and gags that were only funny because you liked the movie from which they were cribbed. I'm not saying the flick wasn't a big hit for Dreamworks, or that the long line of sequels we should expect highlighting these characters in the future won't be a financial bonanza, but as I was told an observer once said of Baryshnikov’s thirteen-pirouette trick, "Yes, Misha -- but is it art?" (Baryshnikov was definitely capable of art. As presented, however, the thirteen-pirouette trick when I saw it wasn't so much art as an unquestionably astounding feat of technical accomplishment. Bench-pressing a car might be impressive, but that's not art, either.) I'll not begrudge Dreamworks its profit -- it does entertain -- but it's not the sort of thing I expect will, like Fantasia (and probably Fantasia 2000), still be viewed with awe by a future generation.

[2] The political digs are present ("Stay the course!") but are such a sideshow as to fade into background. Yes, the dufus CEO of the company that's sent everyone on Axiom to escape the irritations of the real world for five years while a (failed) cleanup attempt is undertaken has been given some lines and costuming seemingly calculated to remind one of the current Commander-in-Chief, but the gags are sufficiently understated that the next generation won't think they're missing a joke, they won't notice the issue at all when they view the unaltered film. This can't be said about some of the dated gags pulled in other flicks.

Sunday, June 29, 2008

Cops Troll Craigslist for Flesh

It's unsurprising that police are screening Craigslist for minors selling their bodies. Prostitutes, after all, admit using Craigslist to sell sex.

What I'm less clear on is whether there are some demographic features the police might be identifying, suggesting why folks in jurisdictions where sex trade is illegal would engage in it. Here, I suppose, my background in policy analysis shows its colors. As with efforts to curb illicit drugs, one sees two main trends: enforcement and education. The enforcement angle isn't interested in why, it's interested in who and where and in obtaining convictions. The education angle may or may not have overlap with enforcement. For example, some needle-exchange programs -- a classic example of harm-reduction theory in practice -- have been threatened with enforcement activity as enabling abuse, despite being intended to mitigate disease risk to the broader population and not merely as a benefit to addicts. On the other hand, where the problematic activity isn't illegal, some activists argue it's easier to find and educate the at-risk population into mainstream behavior -- for example, because fear of enforcement might discourage efforts to access harm-reduction programs. So my interest in data, being driven by an interest in understanding the mechanism(s) and issue(s) involved and thus in developing responses, may be of utter inconsequence to those in the best position to collect the information.

And why would one engage in an illegal sex business? Presumably a person could open a business in Nevada or Amsterdam, after all, and enjoy both freedom from prosecution and the benefits of a lawfully regulated trade (e.g., legally-enforceable contracts). Whether it makes the business any more glamorous or satisfying is doubtful, but knowing you can call the police if things go amiss -- and get help rather than a lecture and a set of cuffs[1] -- should be worth something.

I suspect -- though I haven't done a study on it -- that no small fraction of those in the sex industry were drawn in for reasons that turn on powerlessness. The link above quotes a 14-year-old who had 'worked' Craigslist since the age of 11 saying "I wanted to feel loved. ... I wanted to feel important." An adult open about her sex-industry modeling career explains that she entered it because illness made her unsuitable for ordinary work. It shouldn't be overlooked that esteem and powerlessness are issues that affect models and performers outside the sex industry, too. How many teen girls training for careers in dance are encouraged toward undernourishment and smoking by people (teachers, peers, opinion leaders) who make them hate their appearance or doubt themselves?

Of course, I don't want to be misunderstood as claiming that poor self-esteem (or simple financial desperation) explains illegal prostitution. There are folks with other explanations for choosing sex careers (I couldn't find the links I had in mind, hmph). I suspect that, besides the slave trade (apparently alive and well in the US, not just port cities, and not just for sex workers), desperate people perceiving no personal alternatives likely comprise the principal portion of the industry most prone to the kinds of unhealthy conduct that make folks who oppose legalization continue to do so. On the other hand, if the reason to outlaw commercial sex is to protect victims rather than dispense punishment upon a class of unworthies, perhaps some effort to design and implement a harm-reduction-model intervention is worthwhile. (While I understand the argument that it's a victimless crime, this isn't the argument advanced by those supporting prosecutions.)

In the US, this is about as politically plausible for the sex industry as for the drug trade.

[1] OK, some folks are into cuffs. And the authors of Lady Cop (lyrics) definitely understood this. However, I'm thinking the reason cuffs are considered kink is that they're not mainstream. (Yet!)

Saturday, June 28, 2008

MP3ft! (MP3 is Theft?)

A resurgence of interest in music, despite a decline in CD sales, has meant a boost for vinyl. The newest hook for the audiophile? Buy the LP, get a free download. Why do you need a "compact" disc when you've already got it as small as MP3? (or FLAC, or MPEG-4, or ....)

Meanwhile, iTunes has sold five billion songs in the last, oh, five years.

Taxing Your Memory

Spain wants to tax your memory.

Ostensibly an anti-piracy tax, it will apply not only to music players but to anything "capable of recording, copying, or storing" pictures or sound that might be owned by someone else. This, of course, includes printers and ink cartridges and the media for your own digital camera. After 18 months of delay -- due to protest over the tax's reach and the fact there's no guarantee the proceeds will actually end up in the pockets of the artists the tax presumes are being robbed -- it will go into effect July 1.

Spain isn't the first country to do this. The Copyright Board of Canada decided that music lovers' hard-drive-based music players were being used to pirate music, and extended blank-media taxes to cover them as well. Being based on capacity, the taxes that were a tolerable sip from the trough being spent on blank tapes and CDs became a voracious gulp -- up to $25 apiece -- from the flow of funds spent on portable music players. The fees collected in Canada were returned after the tax was overturned by the Canadian Supreme Court.
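The capacity-based structure explains why the bite jumped from a sip to a gulp. A minimal sketch of such a tiered levy in Python -- the tier boundaries and the lower amounts are my invention for illustration, with only the $25 ceiling taken from the reporting above:

```python
def levy_cad(capacity_gb: float) -> int:
    """Return a hypothetical per-unit levy (CAD) for a recording medium.

    Only the $25 top tier comes from the reported Canadian levy;
    the other tiers and their boundaries are invented for illustration.
    """
    if capacity_gb <= 1:
        return 2      # small flash players / blank discs (hypothetical tier)
    elif capacity_gb <= 10:
        return 15     # mid-size players (hypothetical tier)
    else:
        return 25     # large hard-drive players: the "up to $25" cited above

# A blank CD holds roughly 0.7GB; a hard-drive player of the era held 10-40GB.
print(levy_cad(0.7), levy_cad(40))  # 2 25
```

The point is simply that a levy calibrated to blank tapes and CDs scales brutally once applied to multi-gigabyte players.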

The refund raises a peculiar question: if a year of fees collected between December 2003 and December 2004 were still on-hand in May of 2005 to be refunded to firms from which the tax had been collected, who exactly was supposed to be the beneficiary of the tax? It obviously wasn't artists, who never received it.

And this returns us to original questions about the raison d'être of the original tax: if it doesn't replace artists' lost revenues, what does it do?

(One wonders a bit about the beneficiary of the refund: assuming consumers ultimately bore the tax as a built-in cost of their players, the tax refund would just be a windfall for device makers or importers. Apple had to devise a special claims process to make sure customers got the benefit of the refund; others may not have bothered.)

Friday, June 27, 2008

Knowing The Path Isn't Walking The Path

In The Matrix, Morpheus tells Neo that there is a difference between knowing the path, and walking it. This is true. There are examples all the time of folks who have some realization about the real path to what they want or need, but still don't manage to pursue it.

This post is about Bill Gates' recognition that Microsoft isn't on The Path, and the dates in question make clear that Gates' recognition -- a warning to Microsoft from its own founder -- wasn't enough to get Microsoft on The Path.

For those of you too tired to read, or who simply prefer radio, I offer KIRO-AM/710's Dave Ross conducting a dramatic performance of excerpts of Bill Gates' 2003 email about trying to purchase downloadable software created by Microsoft. In it, he details the challenges faced by a user trying to buy and download Microsoft MovieMaker and the Digital Plus pack. He resorts to frustrated all-caps in places to express his exasperation at a web site (Microsoft.com) whose creators never imagined folks would try to buy downloads from Microsoft's "Downloads" page, and the like.

By 2003 I wasn't much of a user of Microsoft's products, but I'll take Bill's word on this one. I gather that XBox has made it easier for folks to buy at least some downloads, but then, XBox has cost Microsoft so much more money than it's brought in that last year MSFT had to engineer its finances to make the last quarter of calendar 2007 seem to produce an XBox profit. (MSFT accelerated into mid-2007 -- the prior fiscal year -- billions in above-expected warranty-related expenses rather than continue to recognize them as they were incurred, so that the last quarter of 2007 -- while still servicing warranty issues -- didn't have any of those charges booked on the current quarter.) Meanwhile, fierce competition has forced Microsoft to price XBox competitively, in the expectation that its per-unit loss can be overcome with eventual software and services sales. The improved interface and shopping experience MSFT crafts in its entertainment division will, one hopes, bear fruit elsewhere at the company as it fights heinous user experiences.

The other Bill Gates email from 2003 that attracts attention relates more directly to Microsoft's entertainment division.

When Apple first launched the iPod in 2001, it was a Firewire device without any support for Microsoft's customers. Third parties such as MediaFour offered solutions like XPlay to enable users owning Firewire-equipped hardware running Microsoft operating systems to encode and synch music to Apple's ultra-portable (by then-existing standards) music player. Apple didn't lift a finger to support non-Mac buyers for about half a year, until it was dragged into it kicking and screaming by folks who were insisting they wanted to be customers and would pay a third party for the privilege if they had to. Say what you will about Apple's salesmanship, those folks can take a while to pick up a hint.

What's Microsoft care? It doesn't. Microsoft licenses digital rights management (DRM) technology to several firms offering download services, and everybody understands that once critical mass lines up behind Microsoft's format -- and with Microsoft's development and marketing resources to keep its format's performance and availability ahead of competitors, why shouldn't it? -- Apple's customers will either be locked out of the music market (DRM is coming to a new music disc near you! Old music will die!) or Apple will have to license Microsoft's DRM to enable its customers to continue playing it. Sell all the tape players you want, Stevie-baby -- it's the tapes that make the thing sing.

Then, in 2003, Apple launched the Apple Music Store. Well, they pretty quickly renamed it the iTunes Music Store when they were reminded about their agreement with the Beatles' music label Apple Corps that Apple Computer had to keep the Apple name good and clear of the music business ... but it was the Apple Music Store at first. The Apple Music Store was accessible through iTunes (which had previously been known as SoundJam, from Casady & Greene, before Apple acquired and renamed it), formerly a computer jukebox like anyone else's: you encoded your music and you organized playlists and you listened and maybe you synched music to a portable player.

Now you could also buy music.

Not just rent it, which was the going game on Microsoft's DRM platform. Apple had struck a deal -- probably because the licensors saw immediately that Apple's program only ran on Apple's computers and everyone knew how few of those were ever sold -- that allowed folks to buy by the song in addition to buying albums. Apple launched sales-capable iTunes for Macs on April 28, 2003. And Apple customers bought millions of tracks.

But before Apple customers had made it clear they liked the music store, Bill Gates wrote on April 30 -- within two days of the launch -- that Jobs had caught Microsoft flat-footed, and needed to be answered with a good Microsoft music solution. In particular, he cited: (1) Jobs' ability to focus on a few things that counted, have folks do the interface right, and market the whole as if it were revolutionary; (2) "a better licensing deal than anyone else has gotten for music" (which he said "is very strange to me. The music companies [already compete with their own stores and yet] Somehow they decide to give Apple the ability to do something pretty good"); therefore (3) "Now that Jobs has done it we need to move fast to get something where the UI and Rights are as good."

Fast forward five years. Microsoft's given up, in essence, on its partners ever getting music sales straight -- and has launched both its own player and its own store. The store won't sell you one track. The store will sell you a bundle of points, usable also for XBox 360 services you may not yet know you need, and the bundles aren't divisible by 79, the number of points needed to buy a song (79 points works out to 98.75¢). One-click? Change from that point bundle?

Sure, big point bundles protect Microsoft from per-transaction charges that can eat all the profit on a one-song transaction. It doesn't do much to make buying songs a simple joy or an impulse buy, though. It seems to make of song buying the kind of user-hostile maze Gates railed against when he tried to install MovieMaker.
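The arithmetic behind that complaint is easy to check: at 98.75¢ for a 79-point song, a point is worth exactly 1.25¢, and any bundle that isn't a multiple of 79 strands change you can never spend on whole songs. A quick sketch -- the 400-point bundle size is my assumption for illustration, not a figure from Microsoft:

```python
SONG_COST_POINTS = 79                    # points per song, per the store
SONG_COST_USD = 0.9875                   # 98.75 cents per song
POINT_VALUE_USD = SONG_COST_USD / SONG_COST_POINTS  # 1.25 cents per point

def stranded_points(bundle: int) -> int:
    """Points left over after buying as many whole songs as possible."""
    return bundle % SONG_COST_POINTS

bundle = 400                             # hypothetical bundle size
songs = bundle // SONG_COST_POINTS
print(songs, stranded_points(bundle))    # 5 songs bought, 5 points stranded
print(round(POINT_VALUE_USD, 4))         # 0.0125, i.e. 1.25 cents per point
```

Those stranded points are exactly the "change" a one-price-per-song store like iTunes never generates.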

Now that Microsoft's had five years to benefit from Gates' instruction from the helm, one might think Microsoft should have overrun the field. The Microsoft we feared in the mid-'90s would have done it for sure, right? What's changed? And why can't a company with so many really smart engineers (and they are smart, I've met some both before and after they were recruited to Redmond) organize solutions to problems like this?

I have a sneaking suspicion that the increasing size of both Microsoft and Apple is going to make nimble reaction to competitors a bit more challenging. In a few fields, I expect Apple to keep the focus needed to produce high-quality products (developing the operating system and development environment for quality performance on iPhones and near-term-visible hardware, for example), though I believe I already see Apple at the limits of its expertise in other areas. (Apple received criticism for its dealings with indie music labels and small bands, and I expect it will have the same problems an order of magnitude worse when it starts selling iPod software through its AppStore.)

With iTunes and the AppStore, Apple has a possibility of offering content owners worldwide access to customers, and with that power Apple stands in a position to materially change the relationship between bands and the public (and their crooked managers and labels), between developers and the public (like, the public might find them), and so on. Recognizing Cocoa's strength in localization, a single global software marketplace is an extremely valuable prospective asset -- and a likely asset, if Apple doesn't screw the pooch.

Let's hope Apple doesn't learn the wrong lesson from its old operating systems competitor.

On Apple's Software Strategy: Part I

An Apple update today to its "Pro Applications" sends my thoughts to the company's overall software strategy. I do not use the word 'strategy' here in the vacuous sense of "let's make, you know, the apps that people buy and then make bank on it!" I really mean the company's decisions about how Apple will use software (its own software, others' software) to achieve Apple's objectives. Apple's objective may have evolved a bit beyond merely selling boxes, but a look at Apple's revenues shows that Macs and iPods dwarf the payments Apple is getting from service agreements, software as a service, and even software licenses and upgrades. I submit that Apple's objective is to make money on selling computer hardware.

That being said, what's the deal with the software upgrade? This is a hardware company, right? By comparison, Microsoft is a software company -- Steve Jobs pointed out that before Bill Gates, nobody knew you could build a software company -- and Microsoft has the largest group of developers for the MacOS X platform outside Apple ... and is hiring. With operators like Microsoft around, why does Apple need software, much less a software strategy?

Why, indeed.

When I first became interested in Apple, I heard that Apple was buying NeXT. This was an eye-opener. A few years before, while doing support work for a local Catholic college's computer networks, I realized I was spending most of my time putting out fires on MS-Windows desktops.

(Now, for a little aside. I discovered while writing data-manipulation macros in Excel at a major medical center that when I typed examples verbatim from the instruction manual, I got syntax errors. Yes, there used to be paper manuals. No, I could not get a human when I called for support. No, they didn't have the slightest interest in the bugs I discovered while writing the macros. I later learned that folks trying to report errors in APIs offered by NT had to pay triple-digit sums for "support" that required them to wade through several tiers of idiots who didn't comprehend the bug before getting someone who seemed to understand, but would not admit, that Microsoft's API wasn't behaving properly because of bugs. This experience admittedly colors my view of Microsoft products and the likelihood I myself could coax proper function from them. Now, back to our regular programming.)

The fires on the MS-Windows desktops didn't always have obvious solutions, and we often ended up reinstalling over the network. This was so common that everyone in the department came to memorize the network drive location where the MSFT install discs were located. There were two classes of computers that didn't seem to need a lot of babysitting once set up.

One was the Macs, but there weren't many Macs -- just in the music department -- and they weren't tasked with lots of oddball things; the one guy in the support group who dealt with Macs was basically like the Maytag repairman, and he spent most of his time putting out fires on MS-Windows desktops, too. I didn't really have much basis for evaluating the Macs, as my own experience with them had been pretty bad (what's the mouse for? why can't you launch programs with the command line? why won't it launch? you mean if I click it twice it will do something other than twice what it does if I click it once?) but I realized some people really were attached to them. It just wasn't my thing.

The other kind of computer that didn't seem to need much intervention to do as instructed was the kind that was running Unix. There were several Unix machines on the campus, all named for different saints. Basil, for example, was an SGI Indigo with 64MB RAM and 4GB of hard drive. It was ostensibly purchased to handle creation of advertisements and promotional materials, but I think the truth is that (a) nobody at the school really understood how to use then-cutting-edge graphics workstations and (b) the network administrator wanted to play Doom on an enormous SGI monitor with a frame rate he wasn't likely to see on any machine costing less than five figures at the time. The machine was used for web development, though, and probably its graphics hardware sniffed with disdain as it was tasked with cropping photos and preparing images for distribution on the web.

Basil wasn't just a glorified Photoshop box. Basil was also the DNS server for the campus, so every browser trying to find web sites asked Basil for the IP address that would get them to Netscape or Sun to check out the latest new web technology they needed to have immediately. Every student that needed email fired up Pine in a terminal emulator on their NT stations (yes, there was a day when tools like Pine and Elm were really the dominant means to access email) and received that email from Basil, which also locally ran the email sessions the students used to get (and draft, and save, and send) their messages. Basil also (heh) ran the network storage for students, so they could save files they made anywhere on the network. When the world wanted to know something about the University, it was Basil that served their requests -- after handling the image manipulation, Basil also served as the web server.

Well, after isn't really accurate. Basil did all this stuff at the same time, on a single processor, by sharing time slices among processes. Nobody seemed to lose email, nobody (barring a configuration issue on their MS-Windows client) seemed unable to reach network storage, nobody seemed unable to reach the net. The automated scripts Basil ran to snoop the network for misbehavior ran like clockwork, and made little notes in log files for later human digestion. Oh, and while all this was going on, Doom was running full-screen at the highest frame rate money could buy. And it never went down.

While I was trying to learn Unix enough to get by, I realized that the interface was never going to catch on for the masses: one needed facility with cryptic commands and scripting conventions that just wasn't suited to the mass audience. This was the realm of dedicated hobbyists and genuine professionals.

(courtesy of the State of Hawai'i)

My interaction with Basil had been via terminal emulators -- from elsewhere on campus, or from home via my OS/2 Warp box and the modem (well, not the modem IBM sold me; that one wasn't compatible with IBM's operating system -- just with Microsoft's -- I needed to go buy someone else's modem to make it work) and my only phone line. So when I actually sat at Basil and worked from the terminal, I was blown away. The DEC machines I'd used had colored backgrounds behind the many terminal windows you might have up, but one never got the sense one could make the machines do anything using a graphical user interface; the images were just to help you move all the running application windows about as they ran on distant servers. The SGI, though, had folders full of icons one could rearrange and resize with blinding, hardware-accelerated speed, and you could use the desktop metaphors as easily as command line instructions to get results from the machine. The light went on.

Mind you, I wasn't able to do much immediately with the GUI on the SGI, and the SGI expected users to "get" Unix. (Well, Irix, which was SGI's Unix.) But the future was clear: to get a machine that would handle utter abuse with cheer, but which required no special skills ordinary people would be loath to acquire, there was an obvious recipe. Someone -- a someone with a background in user interfaces that people would accept -- needed to engraft that atop Unix just as SGI had engrafted its X-Windows desktop atop Irix, and folks would get a machine that didn't go to hell nonstop, and would let them do some work.

It wasn't a big leap to imagine Apple might be a company whose user interfaces might be acceptable. The problem was that Apple seemed so full of itself, so convinced of the righteousness of its existing operating system (which lacked protected memory, could not enforce acceptable processor-scheduling behaviors on misbehaving threads, and generally left you at the mercy of the worst programmer whose application happened to be running), and so uninterested in the demands of the real world, that the idea seemed doomed to die as a concept.

Heh. But then I heard Apple was buying NeXT. I called my stock broker: what did he know about Apple's financial condition? Nada. Everyone had given it up for dead. Hmm ....

NeXT had made what I imagined to be serious Unix workstations, and it had supported embedding sound and images in emails back when that still looked like science fiction. That the machines had been a commercial bust wasn't something I fully appreciated, but I did know the only plausible reason for Apple to buy a Unix vendor was to leverage Unix into a future product. There was just no alternative. And it had been my vision for what computing should look like.

AAPL had been in the high teens as I chewed on the idea, and then I was told some insider (turned out to be Steve Jobs, before becoming iCEO) had dumped millions of shares and driven the price to about $14. I began to lose my nerve. I wasn't rich ... what if I was wrong? People had been buying junk in the computer market for years, and there was no sign it'd abate. And maybe Microsoft would pull NT into something less sucking than I'd seen.

At length, I bought at $20.35 (this was, I think, two splits ago) on the theory that Apple's Unix acquisition would enable it to launch rack-mounted servers, take over the back office, and replace costly NT systems with the same TCO argument NT used to displace CLI Unix. I did not have any idea I would make money on candy-colored consumer desktops or portable music players. Had I never sold any shares at all, I'd have done quite a bit better; by trading in and out, I missed some good upside.

Why was Apple's software acquisition useful for Apple to sell hardware? To answer that, we need to look at why people buy computers (without software, they're useless) and why people choose to write software for a specific platform and not some competitor (the cost of the tools impacts Linux development; the consistency of the APIs from release to release and support for already-existing software has been a major factor in MS-Windows' endurance; the perceived willingness of users to accept a particular platform has been an issue as people over time variously attempted to deliver solutions on Unix, NT, proprietary embedded systems with names folks might not recognize, etc.), and how all this is related to where the market is going.

I'll try to hit this in Part II. The key to Part I is that Apple needs a software strategy to sell hardware, and without a strategy to protect hardware sales with software, Apple was probably as dead as critics predicted. Apple did turn the company around -- in part through vicious cost cutting, in part by focusing Apple's marketing message on products Apple could deliver in volume where there was demand for them, and in part because Apple's evolving software strategy enabled it to chase markets that previously had been completely inaccessible to Apple. Oh, and Apple's software strategy made Apple freer to change its hardware with the times -- a definite competitive edge for a hardware vendor seeking either a cost advantage or a performance edge over competitors looking to sell the same customers a piece of hardware. I'll hit Part II over the weekend.

China Mobile: Apple Now Speaking Our Language

Apple's insistence that China Mobile agree to revenue-sharing terms reportedly killed iPhone talks in early 2008. But the talks are back on.

But was Apple's position on revenue sharing really so firm? Last November, T-Mobile offered Germans a €999 unlocked phone, which one could activate with any carrier. T-Mobile's alternative was a €399 phone with a 24-month contract. Not to be beaten (and with regulators watching), Orange offered a no-contract phone for €649 to keep France from intervening in its handset-and-service deals. French regulators weren't the only ones concerned about mandatory hardware/service lockups. With arrangements in place with European carriers that could not possibly have involved subscription revenue sharing (just a big one-time payoff), how could Apple really claim it was unable to see its way free of revenue sharing?

When Apple announced a new subsidized phone deal for its upgraded phone (3G, A-GPS, improved headphone jack -- don't laugh, the old recessed headphone jack really made it useless to those of us who might have plugged it into a regular mini-stereo jack, and just killed its use as an iPod replacement), the announced price was a subsidized price. That is, it is the price to the customer after the carrier tosses in the extra bucks to Apple that used to come from the customer directly (or from siphoned service fees). The true cost of the phone with contract is quite a bit higher (e.g., $1,237), which should not surprise customers looking at a phone for $199 ($299 to upgrade from 8GB to 16GB): the subsidy has to be recouped somewhere.
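A quick back-of-the-envelope on those figures shows how much of each monthly bill implicitly carries the phone. The 24-month term is my assumption, borrowed from the European contracts mentioned above; the dollar figures come from the prices quoted here:

```python
total_with_contract = 1237.00  # quoted all-in cost of the phone with contract (USD)
upfront = 199.00               # subsidized sticker price, 8GB model (USD)
months = 24                    # assumed contract length

# Spread the gap between the all-in cost and the sticker price over the
# contract: the part of each monthly bill beyond the $199 handset.
implied_monthly = (total_with_contract - upfront) / months
print(round(implied_monthly, 2))  # 43.25
```

So roughly $43 a month of the service commitment sits behind that $199 sticker -- which is exactly how every other subsidized handset scheme works, too.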

Remember when you upgraded to a desktop computer with 64K, and it cost thousands and was a greyscale machine with no pointing device? How times have changed ....

Now that Apple has devised a way to get paid by carriers from funds they expect to glean from subscribers, and this method -- gasp -- looks exactly like every other handset manufacturer's subsidized set scheme -- China Mobile is now happy to talk turkey with the iPhone vendor. Still, "there are practical issues to be resolved."

Like, how much money Apple wants for these things, in a country where T-Mobile's unlocked price last year is easy to confuse with an annual income. Average annual expenditure on consumer goods last year was $543, which I reckon might not include China Mobile's service plan but could easily encompass the subsidized iPhone price announced in North America. The upside: China's middle class is projected to eclipse the entire present population of the United States in the next dozen years. And India's middle class isn't doing too badly, either. Establishing Apple's software platform in those markets may be an investment worth making, if Apple can keep up the relative value of the platform over time.
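To put that affordability point in perspective with the figures above, here is a quick check of how much of an average Chinese consumer-goods budget the North American subsidized price alone would swallow (ignoring service fees entirely):

```python
avg_annual_spend = 543.00   # average annual consumer-goods spending (USD), cited above
iphone_subsidized = 199.00  # North American subsidized 8GB price (USD)

share = iphone_subsidized / avg_annual_spend
print(f"{share:.0%} of a year's consumer-goods spending")  # 37% ...
```

Over a third of a year's spending for the handset alone, before any service plan -- which is why the size of the coming middle class matters more to Apple than today's averages.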

Thursday, June 26, 2008

Busch Reformulates Alcoholic Energy Drinks


After being pursued by nearly a dozen states' attorneys general over caffeinated alcoholic beverages with names like "Tilt" and "Bud Extra" -- for allegedly "marketing its caffeinated alcoholic beverages to minors and misrepresenting the drinks' health benefits" -- Anheuser-Busch announced that, once existing stocks finish selling, the company has "agreed not to produce, distribute or market any other alcoholic drinks containing caffeine or other stimulants."


But I ask you, would a professional purveyor of addictive toxins market such a product to under-age prospective customers? Recall that the now-defunct Joe Camel logo once enjoyed better than ninety-percent recognition among six-year-olds, nearly matched only by Mickey Mouse for the frequency with which kids properly matched the character to its brand. Whatever they were trying to do, they succeeded in growing youth smoking. With the late-1990s death of Joe Camel and the blatant "smoking is cool" message it pushed with its ads and apparel, teen tobacco use declined and has not recovered. Apparently, though, kids are suckers: they still smoke in greater proportion than adults.

From this observer's perspective: iced tea tastes better, is generally cheaper by half or more, and is usually sold with unlimited free refills -- making it both a consumer bargain and a smart move in case you have to drive home or just fight your way to the door. But let it not be said that I am against beer; I wouldn't stand between you and your beer any more than you'd want to be between me and a well-brewed iced tea with lemon and a hint of mint.

Hunter Finds 'Right To Bear Arms'

Antonin Scalia, a hunter, wrote for a 5-4 majority that the Second Amendment to the United States Constitution (which states that "the right of the people to keep and bear arms shall not be infringed") protects an individual right, and not a right held by local governments solely to maintain formally-sanctioned militias if they chose, as opponents urged.

The majority was quick to point out that the right to own firearms, like the right to free speech, is not unlimited. After all, despite that "Congress shall make no law ... abridging the freedom of speech" one can't expect protection when one yells "fire" in a crowded theater or publishes known falsehoods about business competitors to undermine their professional reputations and steal their customers. The longstanding bans on felons' possession of firearms, or the possession of firearms in schools, were offered as examples of reasonable restrictions on the exercise of the right the court found to exist.

Europeans must be rubbing their eyes with disbelief. There is apparently no well-loved tradition of individuals going armed as a solution to violence or tyranny in Europe. (I offered to send a pistol to a friend in Germany when I was in high school, and the friend's parents didn't seem to get the joke. Apparently, Texans are known to be crazy.) In fact, the tradition of self-defense seems to be eroding in some parts of Europe; when Ozzy Osbourne found an intruder in his house, there was public debate whether Ozzy's decision to confront the burglar was wrong -- after all, a man who shot an intruder elsewhere had been jailed (photo of jailed shooter Tony Martin here). From the BBC link:
Ozzy was right, but he could have got into a lot of trouble (like Tony Martin). It seems the police and judiciary are failing to protect the public, perhaps the government wants to further cut costs by encouraging vigilantes?
-- Jason, London
As it turns out, local government hasn't got an enforceable duty to protect citizens in the United States, either -- even if the state itself helps make sure the peril keeps recurring. Since the state faces no liability for failure to protect the public, pleas for help from citizens in distress need good salesmanship or the genuinely-concerned ear of someone with the authority to effect a solution. In DeShaney v. Winnebago County, a mother was unable to get effective intervention to protect her son from a man whom social services had repeatedly documented to have physically abused the boy, yet she was under the state's court order to keep relinquishing custody of the boy to his abuser; when he was hospitalized and left with permanent brain damage, the United States Supreme Court, in a 6-3 opinion by its chief justice, stated that government officers had no duty "to the public for protection against private actors". Maybe someplace where the police have a duty to protect the public from harmful individuals, a disarmed population makes more sense.

The Second Amendment's language invites thoughts about public safety, though, and does not so obviously invite thoughts of self-defense. Here it is at length: "A well-regulated militia being necessary to the security of a free State, the right of the people to keep and bear arms shall not be infringed." Has the Second Amendment led to public safety? Some have tried to show a connection between concealed carry laws and violent crime decreases, but confounding variables like regional trends over time independent of laws make this more complex to study. John Lott has made an effort to make the case and presents his results here. Still ... even if personal safety is a happy consequence of the Second Amendment, was that its intended result?

Years ago, Soldier of Fortune ran an article describing a postwar document find evidencing that Japan during the Second World War created detailed plans to invade and occupy the western United States, but abandoned the idea. SOF editors pointed out Japan had received and documented intelligence reports making clear that Americans were too naturally unruly to be effectively pacified in such a fashion. I haven't got the cite, and there are lots of other issues in play (like whether Japan ever held a logistical position able to support such an effort), but if it's true it might be the kind of thing that supports the theory that the Second Amendment isn't just a guarantee of a reaction force, but has some preventative power. The presence of large numbers of hunting-quality and military-quality weapons among a population steeped in a mythology of rugged resistance to tyranny is certainly a persuasive reason to suspect the United States is safe from being successfully overrun by a foreign power using traditional military tools, even in the absence of a standing army. (The concept of the nation being redeemed by civilians rather than being defended by a professional military is reminiscent of Red Dawn.) Assuming this all works out to be true, there's a problem: the nation's present enemies, realizing a frontal assault is hopeless, plan nothing that smells of a military attack, but are content to plot murders in an effort to create dismay and expense. (Mere murder, in large scale, has proven to be a very effective tool to get media attention. On the other hand, some enemies take a page from the German effort in World War II, and embarrass the U.S. by undermining its currency directly.)

Hmm. Of what protection would Madison's contemporaries propose individuals avail themselves from such perils? Still, we have some clue: though it took decades of conflict, significant embarrassments, and no lack of blood, the United States obtained something like peace from the northern coast of Africa in 1815, and with it freedom from demands for tribute, which European allies also stopped paying around 1830.

Note: discharging firearms in celebration is likely unlawful within the city limits.

Wednesday, June 25, 2008

United States to Exxon: Just Kidding!

Today, Justice David Souter led five justices of the United States Supreme Court to take what had been a fairly stark warning against those capable of preventing environmental catastrophes and massive economic and aesthetic damage, and reverse it into a wrist-slap.

One may debate whether Exxon should have been held liable for the damage caused by its drunken officer (had Exxon fired him for alcoholism, would it have been subject to suit under federal antidiscrimination law for failing to accommodate him with a sobriety program?), but that's not what the Supreme Court was about. Instead, it decided in effect to turn the concept of punitive damages on its ear and allow economic damages to serve as a sort of guidepost for total damage awards.

Punitive damages exist for a specific reason: punishment. While the specific kind of evil varies from state to state and from statute to statute, sufficient wrongdoing can give rise to a damages award that is larger than the actual damages caused by the wrongdoer -- not as a lottery award to the injured party, but as a guarantee that other wrongdoers won't decide that it's possible to calculate "actual damages" as a cost of business and intentionally inflict them on the public in the context of ongoing business operations.

Let's imagine the world without punitive damages. A petroleum refinery knows its high-volume plant is behind on maintenance, but it also knows that (a) shutting it down for maintenance will cost a fortune, and (b) the maintenance that needs to be done will take so long they might as well wait until something breaks, because the repair would be about as much delay. The third thing they know is (c) the folks working on the plant site are uneducated manual labor whose incomes are small enough that if they are put out of work -- killed or paralyzed -- the worst possible cost is pretty modest. They'll just set aside a damages fund from which to fight claims, and after they wear down the plaintiffs' bankrupted families, settle them. The genius of this is that since the salaries and pay schedules of all the at-risk employees are known, the only thing the risk management types need work out is the likely number of fatalities caused by the expected plant failure, so they can budget for it in the eventual reconstruction project. That way, everybody can be sure the company's long-term safety is assured and still make bonus this quarter.

Punitive damages are intended to throw a wrench into this calculation by adding an unknown damage variable that is based on the assets of the perpetrator, or the perpetrator's profits, or some other measure unrelated to the damages done but intended to get the evildoer's attention. Again, this isn't a lottery: to avoid punitive damages, you need only exercise appropriate care. You can't stop all industrial accidents, but you can conduct yourself with enough care that you don't need special punishment. Punitive damages are for folks who don't bother to care.

In the case of environmental recklessness, the case that punitive damages are some kind of windfall is even harder to make than if the recipient is some poor ex-employee who can't ever walk again. This is because the injury impacts all kinds of people whose utility and value obtained from the environmental resource is difficult to ascertain. In the case of the Exxon Valdez, the employer (according to the court) knew full well their captain had a history of being on the sauce. The employer also knew the captain had a ship full of crude oil that was well-known to cause costly beach clean-ups and to impair the livelihoods of everyone whose business (fishing? tourism? leisure boating rentals or sales?) depended on water being sort-of clean.

By capping punitive damages at a low multiple of economic damages, the court reassures industrial tortfeasors that their risk management teams can go back to costing out the value of human life and intentionally consuming humans and the environment in the pursuit of near-term financial objectives. And imagine the bonanza when their victim is a socialite whose primary activity is humanitarian work and the support of charitable organizations. She doesn't make a dime. When she dies, her family loses nothing. Killing her is free.

Whee!

It's a great time to be in business in the good ol' U.S. of A.

Thanks, Dave!

UPDATE: Dave's discount wasn't the first Exxon (now ExxonMobil, XOM) has received off the original jury verdict. The award as it stands is about a tenth of that declared proper by the folks tasked with deciding the facts of the case. (from the LA Times)
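The "about a tenth" figure is easy to sanity-check with back-of-envelope arithmetic. The numbers below are the commonly reported ones (a $5 billion original jury verdict and roughly $507.5 million in actual damages, with punitive damages capped at a 1:1 ratio to those damages) and are assumptions of mine, not quotations from the opinion:

```python
# Back-of-envelope check on the Exxon Valdez punitive award reduction.
# Figures are the commonly reported ones, used here as assumptions.
jury_punitive = 5_000_000_000   # original jury verdict, in dollars
compensatory = 507_500_000      # approximate actual (economic) damages
ratio_cap = 1.0                 # the 1:1 punitive-to-compensatory ratio

capped_punitive = compensatory * ratio_cap
print(capped_punitive / jury_punitive)  # roughly 0.10 -- about a tenth
```

On those assumptions, the capped award works out to almost exactly one tenth of what the jury returned.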

Selling Apples All Over The World

Apple's entry into the retail business was greeted with scorn. Wasn't Gateway, Apple's superior in direct sales and in total unit volumes, bailing out of retail, chastened?

Apple's performance in the retail space has since been compared to Best Buy and to Tiffany's -- with Best Buy and Tiffany's trailing Apple. Apple's decision to acquire retail expertise before committing to a specific retail strategy turned out to be a good move: it led to good pre-market testing advice, and kept Apple from building stores that would fail like everyone else's.

Given that Apple's retail stores are now recognized as a success, the question becomes: what can Apple do to press its advantages, and when should we become concerned that Apple will run out of room to grow? [1] There's a user on dotMac who offers us some graphic insight into how far Apple's pushed its growth (whole data set here):


Significantly to this observer, growth has increasingly included foreign store openings. Foreign sales capitalize on two trends that are worth discussing a little. First, the United States dollar can really suck sometimes, and globally diverse sales are a hedge against both currency issues and regional challenges. Second, non-US sales play to a latent strength in Apple's development platform. Cocoa is built for localization. Developers need not test against different language-specific versions of the operating system -- a risk on some other platforms, where bolt-on language packs localize the system without helping applications' own localization. Instead, they compile once and distribute apps that, when launched, display whichever of the user's preferred languages the application supports, in the user's order of preference. Change language preferences and re-launch, and you get the app in a new language. One app for worldwide launch and for worldwide sale to folks wanting English, Spanish, Japanese, Chinese, Korean, French, Italian ... slick, eh? And the more Apple actually has customers demanding all these languages, the more valuable the feature will be. Developers wanting access to all these customers without wanting to make and support multiple different app versions will preferentially write for the platform.
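The matching Cocoa does at launch -- walk the user's ordered language preferences and pick the first one the app bundle actually ships -- can be sketched in a few lines. This is purely an illustration of the mechanism (in Python, with made-up names), not Apple's actual code:

```python
# Sketch of preferred-localization lookup: return the first language in the
# user's ordered preference list that the app actually ships, falling back
# to a default (typically the development language) when nothing matches.
def pick_localization(user_preferences, app_localizations, fallback="en"):
    for lang in user_preferences:
        if lang in app_localizations:
            return lang
    return fallback

# A French-first user launching an app localized in English, French, Japanese:
print(pick_localization(["fr", "de", "en"], {"en", "fr", "ja"}))  # fr

# No overlap at all: fall back to the development language.
print(pick_localization(["ko"], {"en", "fr"}))  # en
```

The point is that the per-user decision happens at launch, from one shipped binary, rather than at build or install time.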

You might think "hey, Microsoft already took first place in the available-app race; Apple is a bit late." But suppose you were talking about a device on which Microsoft wasn't the leader in either installed systems or offered applications -- like smartphones? Immediately reaching zillions of smartphone users with your new app, in their native languages (assuming you can at least hire someone to translate your labels competently), would be very attractive, no?

Since Apple's smartphone development environment and its desktop development environment are both Xcode and OS X, getting support for the desktop after luring developers to build apps for the smartphone seems a fairly short step. Add the worldwide application opportunity, and the possibility that enterprises can distribute a single device with a single set of installed applications in a single version for worldwide use by all personnel requiring any language the enterprise chooses to support with its applications, and one sees an interesting reason to invest in OS X and Cocoa in the future.

And an interesting reason folks might later find themselves persuaded to buy.

As Apple rolls out stores in new countries, it is better able to locally train personnel to support subsequent rollouts in the same country. A look at the pace of rollouts in European countries seems to bear out this hypothesis, and may inform observers on the likely trajectory of rollouts in China and other places only just now getting stores.

The fact that Apple's development tools run on MacOS X, and are themselves written in Cocoa, suggests that Apple's ability to encourage new developers in each of these markets is superior to that of some platforms whose localized support faces technical barriers. Also: Xcode is available without additional charge with every installation of MacOS X, and upgrades are available to anyone with a no-charge, lowest-tier membership in the Apple Developer Connection.

Upshot:
  • Apple retail stores seem strongly-associated with sales.
  • Apple retail stores are being opened regularly.
  • Apple retail stores are reaching countries that haven't got much Apple exposure yet.
  • Apple's platform has features that make it more valuable the more linguistic diversity exists on the platform.
  • Apple's GPS-enabled smartphone looks to sell like hotcakes worldwide when it launches next month.
  • Apple's Mac business stands to benefit from network effects that improve the value of the platform on which the smartphone runs, because it is the same platform.
I intend holding Apple shares through the announcement of quarterly results for the period including the international iPhone launch. I expect to look at the then-existing store rollout data to see whether I am happy or deliriously happy with Apple's international performance. I will be paying special attention to any results we can ascertain from Apple's "App Store", its worldwide portal for selling smartphone applications to the entire international installed base in all supported languages (well, it's only these apps at launch, but iTunes Music Store was at one time just music and no television shows or films, too).

[1] Well, if you are Dell the question becomes whether there's a seat for you on this retail bus. Apparently, though, not bothering to stock merchandise turned out to be a bust. Palm's retail foray didn't pan out, either. Gizmodo's article, hilariously-supported-by-art, asks whether Microsoft will try retail next.

Singing a Sad Song at Starbucks

It was entertaining to see Starbucks claim the switch to 2% milk was for the benefit of the public health, as if it weren't feeling enough pain to take an interest in pocketing 8¢ per gallon across 300 million gallons per year.
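For what it's worth, the arithmetic on those figures (taken straight from the numbers above) is simple enough:

```python
# Back-of-envelope on the milk switch, using the figures in the post.
savings_per_gallon = 0.08        # dollars saved per gallon
gallons_per_year = 300_000_000   # gallons of milk per year

annual_savings = savings_per_gallon * gallons_per_year
print(f"${annual_savings:,.0f}")  # $24,000,000
```

Twenty-four million dollars a year is a public-health initiative anyone could learn to love.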

Maybe the coup de grâce was McDonalds (MCD) being rated by Consumer Reports to have better coffee -- "without flaws" compared to "burned".

Having expanded into music to try to capture spare change from its flow of customers, Starbucks now apparently admits it can't do non-coffee products competitively. Or, at least -- it can't do music. So long, sucka.

The remaining question is: with financially significant competitors like McDonalds entering the gourmet coffee business and offering bistro-style or café-like atmosphere in new-format stores (links here and here), will Starbucks still be able to charge a premium for the same beans McDonalds can process into final products with better consistency and effect?

I'm bearish on Starbucks (SBUX) as a result, but I've not gone short. Betting against crazy people puts you at war with the world, and it's a hard place to be.