Red and Blue: Global Harming

Replying to Brian’s post about taking adaptive measures in response to global warming. This was too long for a comment.

Let’s say that, for some reason, you find the editor of the Colorado Springs Gazette to be a more credible source than ninety-odd percent of climate researchers, and you decide that the best political course of action is to do nothing until the crisis is obvious. I’ll stipulate that at such time, we’ll have much more evidence than we do now, and it’s therefore likely that we will have a better set of possible solutions at that future date than we do today.

That leaves three problems.

1) I hope you’ll agree with me that there is some level of human suffering caused by climate change today. (Such as the inhabited island that is no longer above sea level.) Your argument can be rephrased as, “there is not enough suffering yet to call this problem a crisis, therefore we Americans (who will be among the last to suffer) will wait until the suffering and death has increased.” Do you have a benchmark in mind for how much suffering and death is required before we agree it’s crisis time?

2) You further presume that there is a linear progression into crisis; unfortunately, climate science tends to disagree with you on this point. There are tipping points, chaotic effects, and runaway processes whose potential effects outstrip our ability to respond, even presuming continued technological progress. How well would America and Europe respond if the impact is a Dust Bowl cutting off the food supply? Or if England gets dropped into a Siberian temperature zone? Or Boston and Seattle?

3) Finally, note that the same country that you expect to become levee-building ubermenschen is the one that built the levees in New Orleans. At what point do you expect our government to be blessed with such wonderful foresight, and how many cities do you expect to lose in the process?

On the bright side, it will probably be much more scenic to take a boat to Manhattan than the subways, and you’ll be able to get off right at the 34th floor, saving much elevator time.

About the Cult of Macintosh

You can rapidly judge whether an argument is emotional or rational by the amount of backstory that’s necessary to justify it.

This argument will require a lot of backstory.

I’m replying to Brian’s Being All That Apple Can Be essay here, and I can already tell that I’ll spend as much time talking about the “Apple community” and my experience working with Apples (dating back to 1981 or so) as I will discussing these nifty new machines that can boot Windows. In fact, in this essay, the Mac community is all I have room to discuss.

We’re Not Zealots, We’re Fanatics

The first thing I’d like to address is the term “zealot”. Yes, Apple users are, well, emotionally involved with their computers. Actually, all computer users are emotionally involved with their computers, and if you don’t think so, then you’ve never seen an undergraduate have a breakdown in a computer lab when his senior thesis got eaten by a power surge.

We’re all human (most of us, anyway), and we anthropomorphize the technologies we rely on. We name our cars. We customize our cell phones. And we chant reassuring incantations to our computers to encourage them to do what we want.

What differentiates Apple users from the superset of all computer users is that we attach our pet concepts to the brand name. I doubt that there’s any computer user on the planet who hasn’t verbally attacked his computer—brand name notwithstanding—when it foiled his plans for the day. But what Apple users have noticed is that we seem to say nice things to our computers more often than the rest of you.

Perhaps that’s no longer true. Perhaps there are thousands of Windows XP users out there who have named their laptop “Strawberry” and who sing metaphorical lullabies to it when it goes to sleep. All I can say is that I haven’t met those people, but I meet their Apple counterparts on a daily basis.

The vast majority of my interaction with “average” computer users is at Starbucks and other public Wifi points, so all I can report is anecdotal experience. There was once a time when I would frequently be the only Apple user in the store; today, Apples account for half the laptops or more. Apple users talk to each other; the glowing bat-signal on the case is a beacon that invites conversation. I’ve seen this rarely with Palm users; never with Windows laptops.

The distinction between zealots and fanatics is that zealots are engaged in religious battles. Fanatics have reasons, however tenuous, for their devotion. The Apple community does have its zealots, no question; arguably, this dynamic was created when the zealots of the 1980s believed that any computer with a graphic interface was a “toy”. But most of us do stick with Apple for sound reasons, and most of us do note when Apple makes a misstep.

Our Relation to the Mothership

There’s no doubt that most computer companies do not have users sticking decals on their cars. Few non-Mac users ever mourned the loss of graphical doodads on their computers the way we noted the discontinuation of the rainbow Apple and the happy Mac.

But we also remember, and not with fondness, John Sculley and Gil Amelio. We remember the proliferation of beige boxes with incomprehensible numbers and completely different architectures. We remember the twelve different versions of System 7.

Which is why we treat Steve Jobs like a demigod: not because he is the head of Apple, but because he remade Apple into the company we wanted it to be.

And what do we want it to be? Brian accuses us as follows:

Among the most brand-loyal consumers on the planet, the Zealots believe that Apple is a different kind of company.  Nicer.  Purer.  Out for something more than generating profit for its shareholders.  Out to make the world a better place.  The only company on the planet that would willingly forego something profitable for something “cool.” The Luke Skywalker to Microsoft’s Darth Vader. The Ben & Jerry’s of personal computing.

This is almost entirely accurate. Apple isn’t alone in this, either; Ben & Jerry’s does quite nicely on its own corporate benevolence policies, and there are even organizations that promote the idea that turning a profit should not be the be-all and end-all of a corporation, as heretical as that might seem in the halls of Wharton.

Where it is inaccurate is the belief that we don’t care whether Apple turns a profit. You can’t go out today and buy a Timex/Sinclair, or an Amiga, or a SpectraVideo, despite the fact that each of these computers had some rather nifty features. If Apple collapses as a company, then the day comes when we can’t go out and buy a Macintosh. I am seriously invested in using Macintoshes; this is something I care about.

But let’s explore the idea of “cool” for a moment. No, Apple didn’t invent the GUI, but Apple did popularize it. Apple did set the standard for twenty years (and counting) of what a computer should do. Apple also introduced trackballs and palm rests into their laptops. Apple arguably set the stage for Palm devices. Apple was the first to popularize Wifi computing, and the first to build Bluetooth into an entire line of laptops.

Are these merely cool features? Hardly. These are affordances: design choices that allow the average person to do things with technology that were previously impossible. These things did not happen because they were guaranteed to be profitable; they happened because the designers at Apple do think that they are working towards some goal that is higher than the pursuit of profit.

I don’t know what the accountants in 1993 had to say about the profitability of the palm rest design. What I can say, with little fear of contradiction, is that having worked with Apple laptops for 13 years, 10-12 hours per day, seven days a week, I probably owe my lack of a crippling RSI to some anonymous industrial designer working at Apple when I was an undergraduate. Now that palm rests are the industry standard, nearly every other laptop user owes the same debt.

As Brian points out, Apple enjoys a level of rockstar coverage in the tech world and mainstream press that is far out of proportion to its market share. Is that because the news media has been brainwashed by the Jobs Reality Distortion Field, as we supposedly are? Or because it’s generally recognized that when you go to an Apple announcement, you are likely to see something that makes news, even for non-Apple users?

This is why we give allegiance to Apple. Making the world a better place should not be an accusation.

Safety in (Low) Numbers

Which brings us to the perennial market share argument. A few years ago, I found myself quoted extensively on the Internet with the line, “Yes, it’s true: Windows has 50,000 applications you will never use, while the Macintosh has only 10,000 applications you will never use.” From the user perspective, the market share argument has much the same dimensions.

No question, there are more Windows users out there than Mac users, by some vast margin. There are constant arguments about what percentage of people use Macs, since sales figures overlook the fact that Macs have longer lifespans than Windows machines.

I’ll leave that aside for now; pick your pundit and run with his numbers. I’ll just return to Starbucks. In Washington DC, New York, and Philadelphia, at Wifi hotspots, the number of Apples has been steadily growing for years. It’s not uncommon to see only Apples in such places. Maybe all the Windows users have desktops. Maybe Windows has complete market domination of the red states. Maybe Mac users like their laptops more and bring them along to coffeeshops in greater numbers. Doesn’t much matter; the community is visibly growing and has been for some time.

There are two viewpoints a current Mac user could bring to this phenomenon:

1) They might like being part of a small, special clique, a member of the “rest of us”, and view with some suspicion any move by Apple that will grow the market share quickly.

2) They might just like using Apples and talking to other people who use Apples, and the more, the merrier.

Of course, I’m firmly in the second camp. I make my living selling clever ideas to people who use Macs, and every new Mac user is part of Jeff’s expanded target market. However, all of us in camp 2 share some concerns with camp 1:

1) If Apple expands its market by creating radically different computers (i.e., computers that suck), then, since we will eventually have to buy those computers, we fear that someday our machines won’t be as enjoyable to use.

2) A flood of new people means people who don’t enculturate into the existing community as smoothly. Cf. the “Christmas modemers” of the late 1980s who changed the nature of many BBS systems, or the AOL onslaught that caused the “death of USENET”. Mac users are self-selected, so part of why we have a community is that we probably share some things in common. Expand that community rapidly, and the commonality fades.

I personally don’t think either is likely; Apple’s next computers are different, but they don’t suck and I don’t expect that to change. And I’ll worry about the community changes that come with larger market share when it happens; that would alter the community, but there will be concomitant benefits.

In my next essay, I’ll cover technical details that Brian brings up, and get into more detail about shipping hardware.

Spending my summer in Boot Camp

Only Nixon could go to China, and only Jobs could give away a means of booting Windows on shipping Macs.

Amidst the vast quantity of misinformed speculation about Apple that has circulated in the last week, two things have reliably occurred:

1) Apple is getting front-page headlines.

2) Pundits are jumping up and down to declare the death or radical transformation of Apple as a company and of Mac OS.

Suffice to say, as a guy who makes his living using Mac OS, yes, I do have a game plan to learn more about Windows in the next eight months, but not because I’m going to be switching business models. It’s because I think I’m going to have to extend my business model.

Future Directions for Mac OS X

The first point worth addressing is the theory that this will be the death of Mac OS X because developers will only write for Windows and tell Mac users to use the Windows versions of their software. As one website replied, developers could also tell users to hit themselves in the head with hammers.

The existing Mac developer (and consulting) community has two good reasons to support Mac users: it’s profitable, and it’s enjoyable. I suspect that I have the mental chops to become a Windows consultant, but I just don’t like working with Windows the way I enjoy working with Macs. The professional support community won’t voluntarily stop working with Macs due to this quality-of-life issue, and they won’t be forced to make that switch unless working on Macs ceases to be profitable.

(This might be a good time to resurrect a hoary chestnut I’ve been telling for ten years. I did have to stop solely being a Mac consultant a decade ago in favor of being a Mac/Internet/database consultant. My independent Windows colleagues did very well for themselves with a roster of a dozen clients or so; my own similarly-sized roster of Mac clients didn’t pay nearly as well, because Mac clients simply didn’t need professional support as often. I think of this every time I see the phrase “total cost of ownership”.)

The death knell argument goes something like this, to quote an Engadget podcast I listened to recently: Rhapsody, the online music service, is Windows-only. Given that Mac users can “just boot into Windows” to listen to Rhapsody, the service has zero incentive to write a Mac version.

Except, of course, that booting into Windows requires shutting down all of the other applications you might be running. You have to really like Rhapsody in order to do that. It’s a viable strategy for mission-critical software, but it’s simply not going to fly for anything of lesser importance. Mac-based businesses that have software like that already have their one PC sitting over in the corner of the office, next to the last typewriter, which they keep for envelopes; Boot Camp just means that that computer won’t be replaced in the next upgrade cycle.

The ecosystem supporting both Apple and the people who make their living on Apple hardware is going to continue apace. What’s changed is that the membrane separating us from the rest of you just became more permeable. That is a fairly major change, but not one that’s going to adversely affect the health of our community. In fact, the more likely outcome is that this will completely change the landscape of the computing industry by 2008.

Windows for the Rest of Us

This is what a multiplatform environment looks like on a Macintosh, as of two weeks ago:

[Image: multiplatform thumb.png]

Here you’ve got the three major operating system environments, side by side. iTunes is the native Mac software in the upper left. Windows runs in emulation in its own window (actually, in emulated emulation; that’s a screenshot rather than Virtual PC). In the upper right, I have pan running under X11 using GNOME, which in turn uses the Aqua window manager to make those windows mostly interoperable with other Mac software. You can see the Mac Growl notification popping up on top of the X11 window to tell me which song has just started playing in iTunes.

If I wanted to, I could bring up a fourth environment, Mac OS Classic, where I could run OS 9 and earlier software, also in their own floating windows much like pan.

There are two interesting things to note about this setup:

1) X11, like Windows, normally ships in its own environment with OS widgets like desktops, file navigation, etc. If you like, you do have the option of turning this back on with Apple’s X11 implementation and then switching back and forth (without rebooting) between both environments. But as with Classic, Apple shipped the much more useful system of allowing these windows to live side by side.

2) In fact, Apple has never shipped concurrent OS software for OS X that forced you to switch into multiple environments. The beta of X11 for Jaguar did require this, but the shipping version with Panther had the option.

Boot Camp, lest we forget, is in beta.

I’ll hasten to add that I have absolutely no idea what would be required to free Windows windows from the tyranny of an enclosing desktop. It might very well be impossible, or at the very least require too much horsepower to be usable. But we’re talking about the people who shipped a version of Unix that your grandmother can use. When it comes to Apple, I tend to redefine my outer limits of what’s possible.

This extends John Gruber’s idea that Macs are no longer different, they’re special. That is, buy an Intel Mac, and you can do anything you could do with a Dell or a Sony. And then some. Side-by-side windowing takes this further. Copy a picture out of iPhoto and paste it into Act!. iSync your Outlook calendar to iCal and publish it to .Mac.

If I really wanted to push this idea, I’d suggest the possibility of using Automator (an AppleScript utility that lets you write programs without knowing a single line of code) and Apple GUI Scripting (a framework that allows AppleScript to work with applications that don’t have their own AppleScript hooks) to give Windows users the ability to automate their software right out of the box, in ways that are impossible on a native Windows-only machine.
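To give a concrete flavor of what GUI Scripting looks like today against an ordinary Mac application, here is a minimal sketch, driving System Events from Python via osascript. The application name and menu path are purely illustrative, and you would need “Enable access for assistive devices” turned on for it to run:

```python
import subprocess

# A sketch of AppleScript GUI Scripting: System Events reaches into a
# running application's menus and widgets, with no AppleScript support
# required from the application itself. "TextEdit" and the File > New
# menu path are illustrative choices only.
script = '''
tell application "System Events"
    tell process "TextEdit"
        set frontmost to true
        click menu item "New" of menu "File" of menu bar 1
    end tell
end tell
'''

# osascript is the standard command-line runner for AppleScript on Mac OS X.
subprocess.run(["osascript", "-e", script], check=True)
```

Whether side-by-side Windows applications would ever expose their menus and buttons to System Events this way is, again, pure speculation on my part; the point is simply to show the automation hooks Mac users already take for granted.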

(Some of you may have noted that my side-by-side environment contradicts the argument I made earlier about Rhapsody. If this is how it plays out, I still think that Mac software will be written and developed, but it will have to continue to be better than the Windows equivalent. I expect that even in the most highly integrated environment, there will be programming hooks that allow you to do more in Mac native software than with Windows.)

Regardless of what Apple does here, there is one thing that I think is self-evident: Apple is going to do what it can to make Windows on a Mac better than Windows elsewhere. Windows on your MacBook Pro—a branding change that perhaps makes more sense now—is going to blow the doors off your Vaio. Somehow.

2007: A Mac Odyssey

Which brings us to the question of what Apple is going to ship with 10.5, and what it’s going to do to the computer industry.

The first one is a no-brainer: Apple is going to ship configurations that are preloaded with Mac OS X and Windows. After all, other companies are doing this already. And we can presume that Apple’s OEM Windows is going to have critical differences from the stock model, so perhaps with the right support options in place (i.e., same-day on-site service at any Apple Store), it might behoove switchers to buy Apple’s dual-OS system rather than just load in their existing copy.

The reverse case is a bit tougher; I’m trying to decide whether we’ll see an Apple-sanctioned method to run Mac OS on non-Apple hardware. This has also been done already, but there’s a big difference between hacking it together and using a version sanctioned by the mothership. Last year I theorized that Apple could do this by selling cheap copies of Tiger after Leopard is released, but Gruber has me rethinking this with his commentary that Apple makes money selling Macs, not software. I’m further rethinking this because by definition, Tiger will be a second-class experience after Leopard is released, and it’s not Apple’s style to pitch that, even as a loss-leader to entice people to buy Macs next time around.

That being said, it would be trivial to design the next version of OS X so that it does things on Mac hardware that it won’t do elsewhere, and to do the same thing with Apple’s OEM Windows release. (All such DRM would be hackable, but only by the elite; I don’t see this as a market barrier to differentiating Mac hardware by making the software more featuriffic.) So I do still see a market to allow Apple to siphon off the most profitable Windows customers (again, using Gruber’s thinking here) by giving them a dirt cheap way to play with Tiger, in the expectation that they’ll shortly thereafter upgrade their home and SOHO machines to get their hands on Leopard and iLife ’08, or whatever the latest-and-greatest turns out to be.

The requirement here is that Apple can’t be seen as selling a substandard solution for non-Mac hardware. If they think that’s the way it will play in the marketplace, they’ll never sanction this. But if their marketing people—who also have been known to pull off a few miracles—can come up with a way to sell this as the “cheap option which is better than what you have,” and the “better hardware option with the best and most flexible environment on the planet,” then that might be your cue to sell your stock in Dell.

Which brings me to my own game plan, as a Mac guru. I think that it’s a given that at some point shortly, my clients will be using dual-boot environments (at the very least), and it’s a safe bet I’ll be running one myself. (I am really looking forward to playing Half-Life 2 on my laptop.) I have very little doubt that there are individual Windows apps I’d like to use regularly in a side-by-side environment, and it’s my job to recommend to my clients the best tool for their needs. So I’ll be designing a crash course to become Windows-fluent between now and the release of Leopard. I anticipate (and I suspect Apple is anticipating the same thing) that time spent in Windows is going to be 10% pleasant interaction with useful software, and 90% wishing that I were back in my home environment.

But it’s what I think will be necessary in order to hit the ground running when Apple releases 10.5, because no matter what it can and cannot do, it’s definitely true that it’ll contain some interesting surprises.

[The Red and the Blue: Brian Greenberg disagrees with me eloquently and vociferously.]

Independence, vigilance, and liberty

For Independence Day, a few thoughts on our government.

I agree overall with Brian’s assessment of how the O’Connor replacement will go, with a few addenda.

I will continue to give two hoots about John Bolton. I believe John Bolton’s policies will lead to the deaths of millions and the suffering of millions more in the 21st century, just as those same policies had that result in the 20th. Perhaps others have the luxury of seeing John Bolton solely as a political football. They probably sleep better than I do. But unlike them, I will not be distracted by whatever bright, shiny object hits the headlines in the weeks to come.

As for the Supreme Court, unless Bush’s nominee has publicly advocated the slaying of Jewish children to make Christmas eggnog (you know, in order to be fair and balanced over how we make our matzoh), he’s going to be confirmed.

If Bush’s first choice is not confirmed, his next choice will be. This choice will only be marginally less odious, but the Democrats will claim a huge victory. This is what counts as principled opposition in the time of, “Sure, he publicly proclaimed that the US will torture anyone we please, but at least he speaks Spanish.” The same thing will happen in the UN ambassadorship, or the next Supreme Court vacancy.

It’s time for the moderates and the left to wake up and smell the coffee. The forces of tolerance, of moderation, of separation of church and state, all lost along with the Democrats last November. That battle is over. The longer we spend picking over the carcasses, the less prepared we are for future engagements. The right has the presidency, the Congress, many state legislatures, and is closing in on the courts.

So here is what will happen after the Supreme Court is restored to nine justices. Over time, rulings will come down that favor the right. Everyone is watching the abortion decisions, but it’s going to go far beyond that. The conservatives are much happier making their gains quietly, because those wins are the ones that last. And they will make gains. This will set the boundaries for the next round of battles.

Here in America, I’m generally viewed as radical left. Forty years ago, I would have been considered an American moderate. Seventy years ago, with the fascists and the Hooverites on one side, and the communist revolutionaries on the other, I would have been as mainstream as they come.

But today, those same views are radical. Separation of church and state is radical. Belief that American power stems from knowledge and science and equality of education is radical. The idea that our values must be exercised to have meaning is radical. Or so my opponents would have you believe. And to be radical is to be marginalized.

This marginalization does not happen in a vacuum. It happens with the complacency and complicity of the so-called moderates who pretend that these things are not occurring. I believe that many Americans do not want to go down the path that the right is taking us—and yet, we move down that road at a merry pace. Why? Because too many people feel the wind in their hair and put their hats on.

As for the Democrats and the active left: you—we—are failing miserably. The Democratic leadership will share a place in the history of political cowardice with Neville Chamberlain and James Buchanan. The left needs to learn how the modern political game is played, and stop operating under Marquess of Queensberry rules. Your work is for nothing if you do not effect change.

I am an American, and I am a patriot. For my political beliefs, it is common for me to be called a traitor, a supporter of terrorism, a heretic and anti-American. This is now an accepted part of the public debate, and it is allowed because my countrymen allow it to happen. Because the vast sleeping center does not stand up and say that this is not an American mode of discourse. And so long as they remain quiet, the political field will continue to shift, as the right pushes the spectrum further and it becomes necessary to become more conservative to remain safely moderate, where you risk offending no one and can pretend that politics do not matter.

Apathy and silence will be construed as the consent of the governed. History is littered with governments that became theocracies and fascist states by feeding fear and anger to the ignorant. On this, the 229th anniversary of our independence, I am unable to understand how any loyal American can wish that to happen here. Or can allow it to happen through inaction.

Perhaps some of you think I am engaging in hyperbole. Perhaps I am. My question to those of you who do: how large does the risk to your nation have to be to get you to act?

But you must remember, my fellow-citizens, that eternal vigilance by the people is the price of liberty, and that you must pay the price if you wish to secure the blessing.—Andrew Jackson, Farewell Address, March 4, 1837

When in the Course of human events, it becomes necessary for one people to dissolve the political bands which have connected them with another, and to assume among the powers of the earth, the separate and equal station to which the Laws of Nature and of Nature’s God entitle them, a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation.

We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness. —That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed, —That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness.—The Declaration of Independence of the Thirteen Colonies, In Congress, July 4, 1776

[This essay is part of The Red and the Blue discussion: Supreme Court/Independence Day.]

What the Intel switch really means

Well. This is why I don’t make my living as a fortune teller.

Yesterday, Steve Jobs shocked me and 15,000,000 of my closest friends by announcing that sometime in the next two years our shiny new Macs will be running on the brains of the enemy. Well, not the enemy per se. But at least the Vichy CPU. It’s a bit hard to reconcile the dissonance: on the one hand, Intel chips will be wonderful for us starting in a year or two, but right now they still suck.

Naturally, this has caused the largest outbreak of heated debate on the Internet since the release of Hamsterdance. The net impact this should have on most users is zero; the CPU is safely protected beneath about a dozen layers of abstraction, and unless you’re writing software, for most purposes you shouldn’t need to care about this one way or the other. When you board an airplane, do you ask yourself whether the engines are made by Pratt & Whitney? Or is your sole concern that you don’t make a sudden unscheduled stop in Kansas?

But that’s just a rational perspective, which doesn’t mean it will be part of the news coverage. The first AP article on the story—and unfortunately I have to paraphrase here as they’ve rewritten the text—opened with, “taking a risk that could threaten to reduce its already miniscule market share….” The current story leads, “After touting its Macintosh computers as superior alternatives for more than 20 years….”

Well, yes. And we still think the computers are superior, thank you very much. Show me a Wall Street analyst who has had to spend as much time troubleshooting one Windows server as it takes on a dozen Macs, and I’ll be glad to listen to him.

That’s the bottom line. Macs are Macs because of the OS. But perception is reality, and a bloviating pundit on CNBC who thinks he knows Macs because he used one in the Wharton undergraduate labs will get a larger audience than I do.

Still, there are some end-user issues here, no question about it. I know diddly about Intel servers (today), but I’m hearing that they run a lot hotter than PowerPC equivalents, and that’s a problem when you’re building a server closet. The AltiVec unit in the G4 and the G5 can do some really stunning work without taxing the main processor, and losing that is one of the upcoming performance issues.

On the other hand, anyone who can write an emulator that can run the PowerPC versions of Office and Photoshop—crème de la crème CPU hogs—on an Intel chip at decent speeds is practicing a form of black magic. That was demoed yesterday. Apple has another year to practice more necromancy, and I’m sure the result will be summarized: “Yeah, we’d like the emulation to be faster, but this is good enough.”

So what does this mean for Mac users? And for people not (yet) using Macs? I’ve spent the last 12 years working with around a hundred Mac-using offices, and I talk to folks in the other camp, and let me tell you—I think I live on a different planet from most Wall Street analysts on a normal day.

First, once the entire personal computing world is running Intel and Intel-compatible chips, the obvious question is whether you’ll be able to buy a Mac and install Windows on it as well. Initial reports are that Apple won’t support you if you do, but they won’t try to stop you. So now you can buy that sexy Mac laptop (and here’s where we’ll find out if PowerBooks maintain their eye-catching qualities separate from the OS; my guess is “yes”), boot into Windows at work, and boot into Mac OS at home. Given the very large number of people I know who are currently dual-OS users on separate hardware, I think this will be a rather common occurrence.

Which raises another interesting question. The Mac camp has believed for years that it didn’t matter how much better our OS was, or how much money was burned on protecting against Windows virus attacks; since the only glide path to adopting Mac OS was to switch hardware entirely, few companies were willing to take that level of risk. With Intel Macs, you might see a test group buying Mac hardware and dual-booting; if the Mac OS experiment fails, they simply leave those machines on Windows. If it succeeds, then watch that hardware spread further into the company as hardware is upgraded.

What happens if Mac OS is seen as better? Turns out, for 90% of business functions (which means, let’s face it, Microsoft Office and Outlook), there’s Mac software that does the same job or better. What forces many people to stick to Windows OS is that single application that doesn’t exist on Mac; granted, there are 100,000 such applications, but there are maybe a dozen prime suspects, and it’s very rare for a single user to run more than one or two.

Now let’s consider WINE, which allows Linux users running on Intel chips to launch Windows applications within a Linux environment. Tack something similar into the new Mac OS, and suddenly you don’t have to leave the Mac environment to pop into Act! or your CAD software.
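For anyone who hasn’t seen WINE in action, the notable thing is that there is no reboot and no virtual machine: the Windows program simply launches as another process inside the host environment. A minimal sketch, assuming WINE is installed (in Python only for consistency with the earlier example; a shell one-liner would do the same job):

```python
import subprocess

# Launch a Windows program through WINE as an ordinary process in the
# host environment: no reboot, no virtual machine. "notepad" is WINE's
# bundled stand-in for the Windows Notepad; a real application would be
# the path to its installed .exe file.
subprocess.run(["wine", "notepad"])
```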

I’ve read one analysis that predicts that this will lead to the death of the Mac. But this could work the other way: perhaps a side-by-side comparison of Mac versus Windows will lead people to stick with the Mac overall environment, and they’ll launch those remaining apps only with great reluctance. Which is much the experience of Mac users today running Virtual PC or a separate Windows PC.

At the risk of condescending to my Windows-using friends, most Mac users believe that Windows users put up with their software because they don’t know any better. During a recent side-by-side web development session, it made my eyes hurt to see how my website is rendered on Windows. If we’ve been right all along, then more people are going to see how the other half (alright, 16 percent) lives.

So far, though, I’ve still just been dealing with Macintosh hardware. The big wildcard is Mac OS on existing Windows hardware. Apple’s official line as of yesterday is, “no, you can’t do that.” Talk about waving a red flag in front of the world’s hacker community, which loves doing what they can’t do. Apple also said, “you can’t install Linux on an iPod,” and look what happened.

Well, actually, Apple never said that, because they probably never thought that anyone would be crazy enough to try. OS X on Windows hardware, heck, that’s a no-brainer.

So given my previous history of thinking that what happened yesterday would never happen, I’ll try to redeem myself with this one: Mac OS X will be running on Windows hardware within weeks of its release. The only question is whether it will require major geek skills to get it running (cf. WINE under Linux), or whether it’s point-and-click (cf. X11 on Mac OS X).

Let’s move ahead a few months. Windows users are clamoring to buy OS X for their hardware. Apple has already started to see if the dual-boot scenario I posited above lets them sell hardware to previously-closed enterprises—and hence allows them to take a smaller hit on hardware sales by not forcing people to buy Mac hardware to run Mac OS. Anyone who wants to can do this anyway if they’re willing to jump through a few hoops (or, ahem, hire a consultant to do it for them). What it takes is for Apple to believe that their hardware can sell itself on its own merits.

What do you suppose Apple’s response will be? Well, you used to need a Mac to buy an iPod.

Then you finally have the true deathmatch—Mac OS on Windows machines, Windows running native on Macintoshes. Microsoft releases Longhorn, Apple releases Leopard (Mac OS X 10.5), and you can try them both out. But most of the world is still running Windows. What does Apple do then?

Tiger will then be the old OS. It will run on Intel machines. So Apple runs the Switcher campaign of all time and you’ll be able to buy it at any Apple retailer for $19.95. Which means that you’ll have to walk into an Apple Store and take the initial sip of the Kool-Aid: look over the new Apple hardware that blows away the features of your current hardware, talk to a few Apple Geniuses, and pick up the materials from what is indisputably one of the best marketing machines on the planet.

Yes, even I think this all sounds crazy. But after yesterday, it’s the path of least resistance.

Coming next: what current Mac owners should do, what prospective Mac owners need to know, and what this means for any professional working on the Mac platform.

[This essay is part of The Red and the Blue discussion: Apple Switches to Intel.]

Menschenhawks

On this, the anniversary of the WTC and Pentagon attacks, I’d like to introduce you to my friend Brian.

I’ve known Brian for around 13 years now. He’s got a lovely wife and an adorable tyke and a baby on the way, and you can see pictures of them all at his website. He’s got a great head for business and an inexhaustible reservoir of good advice which he dispenses to his friends. If you look up the word mensch in a Yiddish-English dictionary, you’ll see his picture. (It means “truly decent guy”.)

Which is why I was struck by his essay about the WTC. You can read it for yourself, and I strongly recommend you do, as it lays out his background for what he has to say. This includes the following, which he thought as he visited Ground Zero for the first time:

Think what you will about the war against terrorism, the war in Afghanistan, or a war in Iraq. At that moment, staring at the burning rubble, it all seemed very clear. It didn’t need to be fair. It didn’t need discussion or debate. We were going to find “them” and destroy “them.” We were going to make sure this kind of thing would never, ever happen again. And we alone were going to determine when we were finished. Period. The rest of the world could help us or get the hell out of our way.

Brian implies, although he doesn’t actually say, that he’s moved on from this visceral reaction. But this crystallized for me the fear I’m feeling about the motivations of the administration, and of the people who are either supporting them or tacitly going along.

It takes a strong individual to face fear, anger, and the urge for revenge, and then move on and say that those were feelings of the moment. There’s no question that Americans are feeling much less secure than they did on September 10, 2001. Some people are downright petrified.

Part of it is fear of the unknown. The big medical news here is West Nile virus, which has killed several dozen people and has everyone nervously checking for mosquito bites. These are largely the same people who try to go to work when they have the flu, which kills around 10,000 Americans a year.

Likewise, the best way for you to not make it home tonight is to get in your car and drive somewhere, which prevented 41,821 people from ever getting home in 2000. Doing some quick math, that’s about one WTC attack per month. Around eight times as many die from cigarette smoking. Therefore, if we were blessed with Vulcan logic, our war on terrorism would be taking a back seat to other matters more likely to kill us.
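To spell that quick math out, taking roughly 3,000 dead in the September 11 attacks as a round figure:

\[
\frac{41{,}821\ \text{traffic deaths in 2000}}{12\ \text{months}} \approx 3{,}485\ \text{per month}
\]

which is a bit more than one September 11 every month.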

But obviously, logic has nothing to do with this. If it did, people would always fly instead of drive. Risks are assessed based upon our feeling of control, and driving brings with it an illusory sense of control that you don’t get in the passenger cabin of a 747.

So now our leaders are fighting for their own sense of control, however illusory. We gained it first with our victory over the Taliban, if not Osama. We maintain it with airport security lines, threat assessment color coding, and ongoing statements from the administration that range from the vaguely reassuring to the vaguely terrifying, sometimes in the same sentence. Soon, unless there is a major sea change in the political tide, we’ll be going to war in Iraq to get our next hit off of that very addictive drug.

The reason we want to go kick some Iraqi ass is because all of us had our metaphorical moment at Ground Zero, and few of us have recovered. And if we go, we’ll win; if we doubted that for a second, perhaps we’d be less willing to go. The only question is whether Iraqi casualties will outnumber ours by 10 to 1 or 100 to 1.

The problem is that the true questions of security in the Age of Terrorism don’t get answered by the defeat of nations. No one can say whether defeating Iraq is better in the long run for our safety than not going there in the first place. Sure, Saddam’s a bad man who wants nukes, but we’ve known that since 1989. (Before which time, he was our ally against Iran.) The only thing that’s changed vis a vis our national security regarding Iraq in the past year is that there were rumors that he sorta had something to do with al-Qaeda. Those rumors have been repudiated, but here we are, getting ready to head back to the Persian Gulf and leave it a lot flatter than it is now.

If that someday makes it more likely that we’re targeted by terrorists, and decreases our security, the lines of cause and effect will be far too fuzzy to draw convincingly. If you don’t buy the idea that American actions have some effect on the emotions we arouse in others, no amount of this kind of evidence will ever convince you otherwise.

But if our goal is truly “to make sure this kind of thing would never, ever happen again”, rather than raw vengeance or the need to just do something to make ourselves feel better at any cost, we need to start giving some serious thought to what we mean by security, and what we do to sustain it.

[This essay is part of The Red and the Blue discussion: 9/11 Anniversary, 2002.]