Recently in Computer Industry Category

One of the most cogent folks I know, particularly in discussions of publishing and the internet, is Adam Tinworth. I've known Adam through a number of settings, but the one most germane to the discussion is as a business journalist. He's a very, very good one. He's also a fine hand with a fencing iron, I'm given to understand, and as someone who once upon a time stumbled through his share of sabre matches I can respect that, but it's not really a factor in the discussion at hand.

Well, Adam recently blogged about content and paywalls -- touching on the current issues with his usual skill and wisdom. Certainly, the topics he addresses in terms of journalism will resonate with anyone following the somewhat tragic conflict between newspaper cartoonists and web cartoonists. It's a good read.

However, it's not Adam's post, but a comment someone made to him about it that really gets to the heart of the matter. He posted a followup that included that comment, and I've never seen the core disconnect highlighted so well. With Adam's permission, I reproduce it here:

The model you have of your consumer's behaviour is wrong, they aren't using the internet as a way of reading a newspaper, they are using the internet, some of which consists of newspaper content, its a different thing. It was bad enough having to explain this in 1999, I find it a bit surprising it still needs saying in 2009.

That's it. That's the whole shooting match in a nutshell. That's why newspapers that are coming up with new paywall schemes will lose. That's why the internet will win. In the end, the process is inexorable, because the battle is not over content. It is over convenience.

Look at the Encyclopedia Britannica versus Wikipedia. I have had harsh words for Wikipedia in the past, and I stand by them, but I'll also be honest: I use Wikipedia every day. The Britannica, on the other hand, was the encyclopedia of record for much, much longer than I've been alive -- longer than my father's been alive, for that matter. When the Britannica went CD-ROM, I bought it, and bought a copy for my sister's children. It thrilled me that for a tiny amount of money I had access to this seminal resource.

I wouldn't dream of shelling that money out today, even though I (mostly) trust the Britannica's content above Wikipedia's. The Britannica isn't convenient. I can't just link to it when I'm making references to it. I can't just search it casually from any machine without having to fumble with passwords. It takes effort.

Wikipedia is just there. It is always at hand. It is always easy to reach. And it's far more comprehensive on the kinds of minutiae and trivia I really need an encyclopedia for than the Britannica could ever be. Is it a trusted source? No, not really. But it's a great launching point for an investigation if I need a trusted source, and for quick "at-hand" information it's simply unparalleled.

And as a result, several orders of magnitude more people check Wikipedia every hour than check the Britannica website every day. It's not that it's better. It's that it's convenient, when all you want to do is look something up quickly and then get back to the websurfing you were already doing.

I don't know very many people who read a newspaper cover to cover, whether online or on paper. But a lot of people read articles that are germane to them right at that moment. Articles get linked on Twitter or LiveJournal. Google gathers these things together and points people at them when they're interested. And news sources that accept that they're a brief stopover on one's daily web journey get far more traffic than news sources that make a person jump through hoops to get the news. Bring money into the equation, and suddenly that readership drops by another order of magnitude or two. Rupert Murdoch and those like him may assert the value of their goods, and equally assert that content must be paid for, but the only thing they can possibly do is make their content irrelevant to the broader world that's coming.

Let me repeat that.

The only thing paywalls or other direct monetization can do for newspapers or any other topical content is make it irrelevant to the world of the internet age.

Let us say that Murdoch succeeds at making his newspapers secure against Google aggregation and other such things. What happens in that scenario? What does basic capitalism tell us happens in a situation like that? Simply put, someone else develops a product that fills the niche no longer being filled. Some other journalistic organization will step up, develop a model around online advertising or some other thing we haven't even heard of yet, and happily reap the benefits. And let us be crystal clear: that organization might have demonstrably inferior news coverage, and it will not matter. Just like Wikipedia and the Britannica, the convenient Internet stop will trump the more prestigious but less convenient news source.

Let me repeat that.

An inferior news source that is easy to reach and consume on the internet will trump superior news sources that are even slightly harder to reach. Every time.

This is true whether we're talking about the Wall Street Journal or Hi and Lois comic strips -- people are going to gravitate to those things that fit the activities they're already doing. If two newspaper articles -- or comic strips -- are equally available to the online reading public, then the relative merits of one versus the other will determine ultimate popularity. If one article -- or comic -- is freely accessible and the other one requires cumbersome registration or, worse yet, a paid subscription, then the freely accessible one will have monumentally more readers than the other, regardless of their relative quality.

People don't go to the Internet to read The New York Times (with rare exceptions). People go to the Internet, see a reference to a breaking news story, and hit The New York Times for the straight story about it. If the Times isn't available to be read, they won't pay a subscription to read it -- they'll go to the Washington Post, or the Chicago Tribune, or the Miami Herald, or wherever is most convenient. And they will go to news.google.com to get the pointer in question. All that putting a given paper behind a paywall will accomplish is a rerouting of that traffic to the free content available.

Until the day publishers understand this basic principle -- said so well above and expanded upon so clumsily by me -- we will continue to have ridiculous wars between print and Internet journalists, cartoonists and all the rest. Those institutions that can innovate, monetize and produce will do okay in the emerging era. Those who can't will become smaller, niche organizations that ultimately will disappear or be consumed by their more successful brethren. If you don't believe me, ask the folks at the Britannica, which has been sold, split apart, rebranded, and retooled any number of times in an increasingly desperate attempt to remain profitable.

Or, if that's not enough, ask the folks at Microsoft Encarta. If, that is, you can get anyone to answer the phone -- which is unlikely, since they closed down entirely in October of this year -- all except the Japanese version, which closes on the last day of December this year.

I know this, for the record, because I read it on Wikipedia.

[Screenshot: Crossover running a City of Heroes window alongside Mac OS X]
My computer, as I have mentioned, is a pretty sweet 17" MacBook Pro. It does many good things and has a big screen. It is really fast, even though it's over a year old at this point. Its graphics rock. And for 99.7% of my job and 89.6% of my everyday life, it has absolutely everything I need or want. More to the point, it just gets out of my way and lets me work. (I've said it before and I'll say it again: the operating system of your computer is the least interesting part of that computer, for the vast majority of users. People want to use software. The OS just connects you to it.)

But, there's that .3% of my job and 10.4% of my leisure time that just can't get around Windows. Fortunately, there's nothing in my life that requires Vista, so that's something I don't have to cope with just yet. Regardless, I've always had to have a strategy in place to cover the situations where Windows was necessary. In the olden days around the turn of the century, that meant Virtual PC, which on the old PowerPC Macintoshes acted as an emulator -- creating a software version of the x86 processor the software needed. It would work, especially for my job related software, but it was slow -- emulation took processor power and a lot of RAM, and any operation on the 'Virtual' Windows box would need to send a command to the software's emulated processor, which then had to send a command to the actual hardware processor, which would send back its response... yeah. Slow.
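
If you want a concrete feel for that round trip, here's a toy sketch -- plain illustrative Python, with made-up opcodes that have nothing to do with what Virtual PC actually implemented -- of the dispatch loop an emulator runs for every single guest instruction:

```python
# Toy illustration of why emulation is slow: each "guest" instruction
# has to be fetched, decoded, and executed in host software instead of
# running directly on the hardware. (Hypothetical opcodes, not Virtual PC's.)

def run_emulated(program):
    registers = {"a": 0, "b": 0}
    pc = 0  # program counter into the guest instruction stream
    while pc < len(program):
        op, *args = program[pc]          # fetch + decode, in software
        if op == "load":                 # every case is host work...
            reg, value = args
            registers[reg] = value
        elif op == "add":
            dst, src = args
            registers[dst] += registers[src]
        elif op == "halt":
            break
        pc += 1                          # ...before the next guest instruction
    return registers

# Three useful guest instructions cost dozens of host operations apiece.
print(run_emulated([("load", "a", 2), ("load", "b", 3), ("add", "a", "b"), ("halt",)]))
```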

Then came Intel-based Macintoshes. And from the day they were first proposed the rumors flew -- if these were based on Intel processors, did that mean these Macintoshes could actually run Windows too? The obvious answer was 'of course,' which led to any number of arguments with the Macintosh Religious Fanatics, who couldn't believe there would be a day that Steve would allow such heresy to run native 'pon the machines to which he granted his blessing and the blessings of St. Wozniak. There was a point where I had a relatively impassioned argument on the subject standing on the expo floor of an educational technology conference. I was debating an Apple engineer, mind, who absolutely swore to me that Apple would never have an official, Apple-branded and approved solution for running Windows on an Intel-based Macintosh -- the entire idea was absurd and there was no way they would ever, ever do it. Not in a million years.

As I recall, it was less than six weeks before Boot Camp was announced. I've seen that engineer since then, it's worth noting. He's not shy about eating crow.

Boot Camp was a great innovation. You could take your copy of Windows and reboot your computer into that operating system instead of Mac OS X. Wham, bam, boom, you were running Windows. And, with the MacBook Pro, the hardware ran Windows faster and more smoothly than any other machine I had used. There is an advantage, you see, in having absolute control over what hardware goes into your notebook computer. Suddenly, drivers... well, work.

Now, Boot Camp was an excellent option. I still use it today when I want to bury myself into City of Heroes and make it real pretty like. But it's not terribly convenient. To run Windows in Boot Camp, you have to not be running Mac OS X. In effect, all that stuff you have running on your computer normally just goes away, while you use your new shiny aluminum Windows XP machine. In fact, Boot Camp actually partitions your hard drive, so that you have a hard drive partition for your Mac, and a different one for your Windows install. They both work swimmingly on the machine, but they don't coexist well.

To that end came the next step in the evolution of Windows and the Mac -- virtualization. Unlike the confusingly named Virtual PC, virtualization software doesn't create an emulation of a Windows-based machine. Instead, it's a hardware-assisted virtualization package. In effect, it carves out some of the system's resources for Windows instead of the Macintosh, and then launches Windows XP on them. For all intents and purposes the virtualized desktop has its own processor, RAM and access to all the devices, even as it runs inside the host operating system. As a result, it (theoretically) can run software at the speed of the system processor, without emulation lag. You can run Windows and Mac OS X simultaneously. And both of the two major virtualization packages (Parallels and VMware) have modes that will make Windows programs behave like Macintosh programs, with their software icons in the Dock, no "Windows" window needed, and the like.
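
(If you're curious whether your own Intel Mac has the hardware assist -- Intel's VT-x, which shows up as "VMX" in the CPU feature list -- here's a quick, hedged sketch. It assumes the machdep.cpu.features sysctl that Intel Macs expose; on anything else it simply answers no.)

```python
# Quick check (Intel Macs only) for the hardware virtualization support
# ("VMX", Intel's VT-x) that virtualization packages take advantage of.
# Assumes the machdep.cpu.features sysctl is present, as on Intel Mac OS X.
import subprocess

def has_vt_x():
    try:
        features = subprocess.check_output(
            ["sysctl", "-n", "machdep.cpu.features"]
        ).decode()
    except (OSError, subprocess.CalledProcessError):
        return False  # not an Intel Mac, or sysctl unavailable
    return "VMX" in features.upper().split()

if __name__ == "__main__":
    print("VT-x available:", has_vt_x())
```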

That was a better solution for many things -- certainly, most Windows programs ran more smoothly. But some stuff -- especially games -- didn't work well, or at all, especially when high-end graphics were involved. And while you can set these systems to use your Boot Camp partition as their source for Windows, you had to activate Windows a second time anyway, and all too often that meant calling Microsoft to explain that no, you're not trying to run one copy of Windows on two computers -- you have a Macintosh and....

...which is the obvious 800-pound gorilla in the room. For any of these solutions to work, you had to buy Windows.

Look, I like Macs, but I don't hate Windows XP. I've been using various forms of Windows since the Windows 3 era. But let's be frank -- even if you're lucky, Windows XP costs $120, all for an operating system that will sit inside a virtualizer to run those few programs that you, the Mac user, can't otherwise use on a Macintosh.

Fortunately, there are Smart People in the world, and those smart people figured out a while ago that they didn't want Windows features on their machines -- they just wanted to run some Windows software. Most of those smart people were using Linux, and they got together and launched the WINE project, which seeks to create alternate, Windows-compatible shared libraries for Unix-derived systems.

Which means, in effect, that systems that can run WINE can run at least some Windows software without Windows. You can see where we're going with this, right?
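
In practice, using WINE directly is about as unglamorous as it gets: you point its loader at a Windows executable, and it supplies the Windows-compatible libraries itself. A minimal sketch, assuming WINE is installed and on the PATH, and using a purely hypothetical .exe path:

```python
# Minimal sketch of the WINE idea in practice: hand a Windows .exe to the
# `wine` loader and let its Windows-compatible libraries do the rest --
# no copy of Windows involved. Assumes `wine` is installed and on the PATH;
# the .exe path below is a hypothetical placeholder.
import shutil
import subprocess
import sys

WINDOWS_APP = "C:/Program Files/SomeVendor/sometool.exe"  # hypothetical example

def run_under_wine(exe_path):
    if shutil.which("wine") is None:
        sys.exit("wine is not installed (or not on the PATH)")
    # wine translates the Windows-style path and the program's Windows API calls
    return subprocess.call(["wine", exe_path])

if __name__ == "__main__":
    run_under_wine(WINDOWS_APP)
```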

At this point, I use Crossover and Crossover Games. Crossover is designed for standard Windows software, and uses the most current stable release of WINE. Crossover Games is... er... used for games, and it's got the most advanced -- and less well tested -- bits of WINE in it to enable gameplay. Supported games, like Half-Life 2 and EVE Online, actually play pretty damn well.

Crossover lets me do probably 95% of my work related WindowsFu, which is more than enough for my purposes. Once in a very great while an esoteric configuration program needs the real deal, but it's rare. As for Crossover Games... well, it lets me launch and run City of Heroes, which is pretty much the only game I can't get for Macintosh to begin with. It's not perfect -- it's Unsupported, which means they haven't developed a specific build for it, and there's stuff it gets hung up on, but for most play it works pretty damn well. Which means I can have a CoH window open while doing other stuff. (As the screenshot above shows -- if you click on it, you'll get a full sized version.)

What this means in the longer run is an open question. Microsoft isn't exactly happy with the existence of WINE and its various forks. There are reports that Windows Genuine Advantage -- the system that blocks updates and downloads on machines running pirated copies of Windows -- specifically blocks WINE. Their word is that WINE is by definition not "genuine Windows," which is true enough, though it's also not pirated. (For the record, it's not illegal to reverse engineer software. Which is how Linux exists in the first place. And Dell, for that matter.) At the same time, the existence (and continuing development) of WINE means that the software we use today can still be used tomorrow. There will come a point in the development of Windows where shared resources today's programs depend on are deprecated and ultimately removed. To a degree, this has to happen -- as cruft builds up in operating systems they become less and less stable. You have to wipe out old code to make the new code work well.

Of course, when your old software won't run on your new replacement computer, you have to replace your software. Which Microsoft sells. It's the flip side of new software not working on old operating systems, both for practical reasons (it would be at the least a technical challenge to make Office 2007 run on Windows for Workgroups 3.11) and for obvious financial ones. Which is why Microsoft is trying so hard to drive a stake into Windows XP now -- they want people to run Vista, because they want to sell Vista, and if they sell just enough Vista then they can release software that requires Vista. And a few more revisions down the line, they'll quietly stop supporting software that today requires XP, and then you'll need to upgrade that too.

That sounds sinister, but it's not. It's the necessary business model for a company that made its fortune selling operating systems (the most boring part of the computer) and office productivity software. For a while, they could add features and people would upgrade. These days, more and more people are content to hold off on upgrading their OS and their programs until they actually have to. So, it's in Microsoft's financial interest to ensure that people have to.

Before I sound too much like a fruitbat, let us make no bones about it: this is common. Remember, I'm using a Macintosh, and have been for years. And for a couple of decades we had the Macintosh Operating System, all the way up to Mac OS 9, and a monumental library of software written for it. When Mac OS X came out, Apple had to spend a lot of time, money and effort developing a very WINE-like software layer called Classic that let people run Mac OS 9-compatible software on their Mac OS X machines. They also developed a coding environment and API called Carbon that let people develop software for both Mac OS 9 and Mac OS X in one go.

I work at a school, for the record. One that has used Macintoshes since the early '90s. One that developed a monumental amount of curriculum that ran on Mac OS 9 and before, using programs by companies that had themselves moved on over time. We used a ton of Classic on our Macintoshes.

And then we had the Intel Macs, and Classic went away. Now, if we wanted to continue to use that software and that curriculum, we would need to do it on older, PowerPC-based machines. At least, as long as the hardware survived. So all's good, right? Well, sure... only we're continuing to develop curriculum on more recent systems and with more recent software, which needs more recent operating systems... and Classic has gone away as of Mac OS X 10.5 (Leopard), so even PowerPC-based Macs running Leopard can't use the old software at all. And it looks very likely that Mac OS X 10.6 (Snow Leopard) won't run on PowerPC systems, period. Gosh, that's stunning.

Naturally, WINE and the various other projects (including some Macintosh projects) mean that newer machines can still end up running older software. And, y'know. Blah blah blah open source blah blah monopolies blah evil. You've heard it before.

And besides, who are we kidding? I don't have Crossover because it sticks it to the man -- for Christ's sake, I own a legitimate copy of XP for this machine. And I don't have it because I'm scared that one day I won't be able to run City of Heroes. If the client stops working on current operating systems sometime in the future, it seems unlikely the servers will still be up in the first place. I have Crossover because it's way more convenient to click a dock icon and have it launch without a full boot cycle when I need to run configurations, and I have Crossover Games because I was sick of rebooting into Windows to play City of Heroes. It's not high minded and it's not protest-driven -- it's convenient.

And that's pretty damn cool.

Completely random Necropost.


Hey all. This is random because I'm up to my neck right at the moment.

However, for the record? Apple Premium Repair Dispatch has unexpected hold music. Wakefield's "Say You Will" just passed by, and now it's Tori Amos's "A Sorta Fairytale." Which makes for an odd state of mind while you try to find out if your onsite repair service tech is stuck in an ice storm or not.

In Apple's defense, it's a very nice cell phone.

You have to understand. I'm a long-standing Apple fan. My big graduation present from high school, back in the mists of time before most of you were born, was an Apple IIc with monitor and printer. One of my first purchases in Seattle was a behemoth Macintosh IIvx that was surplused from Boeing. Later, I upgraded to a Duo 230 with DuoDock (man, I loved that combination). The first major purchase I made when I established myself as middle class was one of the last generations of the pre-Mac OS X Macintoshes, the Power Macintosh 8600. (A computer still in nominal use today, I would add.) At my day job, I sysadmin for Macintoshes. I've been a part of the purchasing decisions for the school, and had a significant role in close to four million dollars' worth of Macintoshes and other Apple products over the past decade.

And MacWorld Expo is one of those wonderful times of year to be a Mac user. We get our Brent Sienna on -- we go all pretentious and excited, and we tell the world about the exciting world we live in that you too can be a part of. And the centerpiece of MacWorld Expo is the Steve Jobs Keynote, where he comes out onto the stage in his sweater, lights gleaming off his receding hairline, and proceeds to redefine reality with the power of a Balseraph and the conviction of a Preacher who sells used cars on the side. It's fun.

And so we came to this year's MacWorld Expo. And this year's Keynote. Coming off of a banner Apple year, no less, with a lot of excitement in the air. There's a new operating system coming out. There are Core 2 Duo computers. There's things, and we're full-on ready to grab hold of them. And we were waiting for Brother Steve to come out and show us the promised land.

Well, we have seen the land of milk and honey now. Only I can't say that the milk is healthy for drinking and the honey would trigger my dumping syndrome, and I'm feeling at best some Christmas Let-Down.

It's not that the previewed products are bad. They're not. They're solid pieces of engineering. They're exciting. They're well designed. In short, they're Apple products.

They're just not products... well, for me. Or, for that matter, for most of the Apple faithful.

There was the usual "here's how much better business has been" gloating, and the obligatory Microsoft mocking (including yet another Mac vs. PC commercial -- which continues the odd but moderately delightful casting of the Macintosh as the somewhat staid straight man, with the brilliant John Hodgman getting all the laughs as the PC. Frankly, the Mac's a better computer, but I'd rather spend time with the PC.) And then we actually got to the new product announcements. The charting of the course for the year.

That course opened with the Apple TV -- a box that looks like a very thin Mac Mini. The device is designed for WiFi or wired network access, and it allows full-on syncing with a Macintosh and streaming from up to five others. It then feeds that signal at 720p into a widescreen television, letting you take all the video you suck down from the iTunes store (or otherwise get into iTunes) and watch it on... well, your television.

And it looks good. That much is very, very true.

But... it requires component video or HDMI out, and a widescreen television, to use. And... it has a 40 gig hard drive, which is smaller than the one my iPod Video currently has. The iPod Video I can put on a dock and watch on the television I already own, rather than having to buy a new television.

Which doesn't make the Apple TV a bad product. It's not. It's really slick. But it's nothing that'll be in my life any time soon. For three hundred bucks I could get some pretty staggeringly cool video components for my current setup. And if I did get a new HD television, that money would probably go a lot farther towards grabbing a full PVR for it, instead of an interface for the more limited selection of video in my iTunes folder.

(I actually have a ton of video in my iTunes folder, but a plurality of it came from my Tivo, which means it's not high definition in the first place.)

But fine. A cool thing I can't use is still a cool thing, and it was clearly setting the stage for something amazingly cool.

Really.

In Apple's defense, it's a very nice cell phone.

It's called the iPhone, and it's been rumored for approximately as long as there have been Apple and cell phones. It is a full-on next-generation smartphone, which looks as easy to use as Apple products usually are. It has monumental integration with contact information, it's widescreen with a massively cool touchscreen interface -- it's absolutely the next generation of these things, and at four or eight gigabytes of storage--

Um...

Well, it'll replace your Nano, dagnabbit! And it's gorgeous and exciting, just plain working and blowing the socks off of any other phone in the room. Which is good, because it's as expensive as any phone in the room, with a two year commitment. But it deserves to be. Seriously -- this thing is just astounding.

But it's exclusively on Cingular, and Cingular doesn't work all that well in these here parts, and I'm not going to pay that much money for something that might not work all that well for me. If I were in the big city, I'd think a lot harder about it -- it's that much the sex -- but right now it wouldn't make sense at a fifth the price, let alone at what they're actually asking.

Even if they worked well in my area, that is a lot of money, and while I have an iPod Video and a cell phone and a PDA, and this wouldn't cost as much as all three of those did... I already have an iPod Video, a cell phone and a PDA, and they're not going to give me my money back for those.

Okay. So there were two cool things -- and an intimation that Google and Apple were getting really cozy these days, and an announcement of Paramount coming to the iTunes store, which... um... well, cool, I guess. And then they had a musical number... but that was okay. They hadn't done "One More Thing" yet. There was always "One More Thing," and it would blow everyone's socks off. Maybe it would be Leopard-related, or a MacBook tablet (though the new third-party ModBook is poised to come out -- at least until the cease and desist arrives). Or something.

But there wasn't. There wasn't one more thing. Except an announcement that Apple Computer was becoming Apple, Inc. After all, they sold digital music, and music players, and phones, and consumer electronics. It didn't make sense to call themselves a computer company any more.

And... that was it. A thing for the television, and a cell phone. No computer announcements. No Leopard update. No software update. No announcement that the Intel Adobe Creative Suite was about to come out....

...and here we were. At MacWorld Expo (not AppleWorld Expo), we had a couple of really cool consumer electronics announcements, and a musical number. The tone for the year has been set, and it ain't the Macintosh.

But in Apple's defense, it's a very nice cell phone.

This is like a post, only it's not.


Every so often, I try to put into words just how Wikipedia has taken its mind-numbingly huge potential and somehow managed to squander it. I do this in good faith, and I also try to explain why I constantly use Wikipedia even though I think it has wasted said potential.

(The answer to the latter is simple, for the record. Wikipedia makes a phenomenally good starting point for a journey. It just makes for a terrible destination.)

Anyhow, the most brilliant man on Earth, Lore Sjöberg, has managed to explain it vastly better than I ever could.

And been funny all at the same time.

In other news, I am recovering from the truly excellent run of the play by rereading the complete works of Jeffrey Rowland. I'm into 2001 of When I Grow Up. It remains significantly better than many things that today I consider good, and yet Rowland considers it one of his weaker works. I take this to highlight the true and honest brilliance of Jeffrey Rowland, who is no Lore Sjöberg, but he does his best. And besides, who is Lore Sjöberg? Other, of course, than Lore Sjöberg. That old Legion of Super-Heroes intelligence scale, the one Brainiac 5 was a "12" on? Lore Sjöberg is a 20. In fact, the scale is called the "Sjöberg" scale, and originally Sjöberg was defined as a "1" and everyone else was rated relative to him. However, it got depressing for people to be described as .02 intellects, so they finally multiplied everything by twenty and rounded to the nearest integer, so that people would feel better about all of it.

It was, of course, Lore Sjöberg's idea.


Other Recent Entries

We were unable to solicit a comment from Bill Gates, as he was too busy covering his naked body in treacle and rolling in giant piles of money. So, you know, "Thursday."
Well. I knew Hell was in danger of freezing over. I just didn't expect it to happen so soon.…
The quote is actually a quote, for the record. I didn't make the quote up. I wouldn't do that with a quote.
So, we're in the process of waiting for doom (doooooooooooom!) at my place of employ. (We switch our ISP…
Okay, here's something I do care about, with the move to Mac Intel
I mentioned earlier that I didn't care if they switched the Macintosh from PowerPC to Windows. And for the most…
This is like a major change in all the way we do things electronically, only that it's not.
I've been getting a lot of e-mails from folks wondering about my reaction to Macintosh computers switching to Intel based…
Somehow, the fact that it was written by Germans just makes sense.
So, having had to deal with the end of Pages, I have gone seeking a balm. Something to wash away…