nu's not unix

vnu

Computers as bling

posted on Feb 23, 2011
[Image: Sony Vaio laptop]
One of the weirder memes out there is that Macs are only good as fashion accessories, the implication being that people who buy them have more money than sense (or alternatively more fashion sense than common sense).

The meme is especially weird because Macs only come in 1.5 colours--white (is that even a colour?) and unpainted (I'm pretty sure that's not a colour). Their stylistic minimalism means there's really not much there to accessorize with. They look like either a large pad of lard, or an unfinished ingot of metal. As bling goes, they kinda suck.

Over at Sony, they understand how to build computers as fashion accessories. The Vaio Fall Collection (this is no joke) is available in black, gold, glossy carbon, bordeaux red, sangria red, striped, wavy black, wavy white, arabesque black, arabesque gold, crocodile black, and crocodile pink. CROCODILE. FRACKING. PINK.

The question, then, is this: does Sony actually believe this goofy meme, and are they trying to out-do Apple at compu-bling? (They are certainly succeeding!) Or is this meme just another bizarre fantasy of computer geeks who are not known for their fashion sense, and is Sony actually doing the opposite, distinguishing themselves as the anti-Apple?

Cathode

posted on Jan 26, 2011
[Image: Cathode screenshot]
This "vintage terminal emulator" makes me inexplicably happy.

I switch all my terminal apps to green text on black backgrounds for that old-time VT100 effect, not just because it's nostalgic, but because it makes it much easier to pick out my terminals from my text editors and various other black-text-on-white-background windows. But Cathode goes the extra mile, with special "features" like:
  • highly curved monitor glass (with reflections!) with a fat monitor bevel
  • slow-fade phosphors
  • thick raster lines
  • authentic "fonts"
  • optional amber phosphors for that high-end hardware effect ;-)
  • faint refresh scanlines
  • adjustable jitter and noise... these are "enhanced" on the free version ;-)
  • adjustable ambient light
  • adjustable baud rate for an authentic coupled modem look-and-feel
Yes, I actually used to do real work on machines exactly like this. Someone went to a lot of effort to reproduce the complete experience. The only thing missing is the 20 pound keyboard with the shiny black typewriter keys.

Netcraft confirms it: Desktop Linux is Dead

posted on Oct 20, 2010

Okay, not Netcraft, but recently people have been reluctantly admitting that desktop Linux is dead. I think it took the tablet wars to finally wake them up to the fact that the day of the desktop is done. The real energy in the industry is pushing in other directions. And since Linux didn't win the desktop war, it must have lost it.

The thing nobody in the Linux community wants to admit is that desktop Linux died in 1997. That was about when RedHat bundled FVWM95 as the default window manager in RedHat 5. Before that, Linux was targeting the high-end unix workstation market, to devastating effect. Not a single major unix vendor survived that era with its value intact. Only Sun really survived at all, and limped on for a while before being bought out by Oracle.

Linux was a ferocious unix-killer because it provided 90% of the value for 10% of the cost. Its desktop was a unix workstation desktop--large, not much going on cosmetically, but versatile and highly functional for professional, technical users. Professional workstations used to start at about $15-$20K. Linux let you pick one up for a couple thousand brand new, and less than that if you were recycling an old PC.

What RedHat and the FVWM95 travesty did was definitively re-orient Linux from a unix killer to a wanna-be Windows PC killer. The target was no longer professional, technical users, but discount-conscious consumers. But now they were offering 90% of the value of a Windows PC at 90% of the cost of the PC. This was hardly the value equation that allowed Linux to utterly destroy the unix workstation market. It was hardly a value equation at all, and Linux advocates found themselves having to differentiate themselves using esoteric ideas about software freedom, which casual computer users don't understand and really don't want to spend time thinking about.

What is more, most PC users don't really know how to use computers, which is why the Windows UI is such a cocked-up dog's breakfast of broken metaphors and poorly-implemented rip-offs of other companies' ideas. Half of the bad design decisions in Windows can be traced to the fact that early versions of Windows ran on low-resolution 14" monitors, which aren't big enough to support a proper desktop metaphor. Start buttons, taskbars, maximized windows, monolithic apps, poor drag-and-drop support--all of these Windows disasters stem from the fact that you can't really multitask on a small monitor, because you can't see two things at once and still work on them both. (Same problem you have on smartphones today, really.) But since unix workstations typically used high-res 17-21" monitors in the 1990s, they did not have such problems.

Re-orienting Linux to target Windows users was a colossal strategic blunder. It squandered much of the goodwill previously gained in the professional ranks, and moved major parts of the operating system and user interface backwards in an attempt to appeal to consumers who were accustomed to working on crappy machines. And somehow, Linux advocates imagined that these users, accustomed to substandard computing experiences, would embrace a cloned substandard computing experience that didn't even have the proprietary software they had learned at great expense of personal time and frustration. FVWM95 was the canary in the coal mine.

The final nail in the coffin was when Apple released a full-blown unix OS that attempted to better Windows, not ape it, and which did support the major proprietary software packages. That was 10 years ago. The last shovelful of dirt on the grave came with OS X on Intel processors, which meant you could run Linux apps natively and Windows apps in fast virtualization or multiboot, making it the ultimate compatibility machine. That was 5 years ago. Desktop Linux has been dead a long time.

A superior long-term strategy would have been to differentiate Linux from Windows (and OS X, for that matter) by taking up the mantle of the high-end unix workstation and presenting Linux as the elite computing platform, the true successor to SGI and Sun. In hindsight it would not have changed the actual market-share equation much, but it would have produced a superior computing platform, one ideally poised to succeed in the post-tablet world.

Why would an elite desktop platform succeed where desktops in general are dying? Because desktops are dying in the 90% of the market that doesn't give a crap. Desktops like Windows or Gnome or Aqua are not a feature for these people--they are an annoyance that must be tolerated to get something else done. These people will migrate to iPads and the like, and the desktop marketplace will increasingly be dominated by professionals. Just like it was back in the 1990s, when Linux was destroying its competitors.

I probably spend more time in Linux than in OS X, and the current releases are pretty damn good for what they are trying to be. But what they are trying to be isn't relevant anymore. Watching Linux trying to master the desktop is like watching Microsoft trying to produce a mobile OS called "Windows". You can only shake your head and hope that one day they will join the rest of us in this decade.

But here's the thing: the death of the Linux desktop might very well be the best thing that ever happened to the Linux desktop. The iPad has shown us the future of popular computing, and it is utterly unlike the old desktop. Doubtless the Linux clone crews are already hard at work ripping it off, using Android or ChromeOS or Ubuntu Touch or some other Linux-based follow-on, and that's fine. Because if they can bleed off the Linux advocates' urge to appeal to non-technical consumers, the Linux desktop might be able to get back to its roots as a professional workstation. And independent computing might be able to get back to its proper role of advancing the state of the art, instead of retarding it.

Open mobile OSes

posted on Oct 20, 2010
Andy Rubin responds in a single tweet to Steve Jobs' gibe that Android's openness is disingenuous:

the definition of open: "mkdir android ; cd android ; repo init -u git://android.git.kernel.org/platform/manifest.git ; repo sync ; make"

He's right, of course, except for one small point: can the phone owner do this, or only the phone company? Because if it's the latter (and it is, for some phone models), then Android is the opposite of open, from a consumer perspective. It's really the particular phone model that is open or not, not Android itself.

Utilimalware

posted on Aug 3, 2010
There's a lot of chatter out there about the fact that you can jailbreak your iPhone 4 simply by visiting the website jailbreakme.com (not linked, in case you are reading this from an iPhone), which uses a PDF exploit to jailbreak the phone. This was quickly followed by a blizzard of cognitive dissonance from the nerd masses, who like nothing better than to wax retarded about Apple.

One theme that arose quickly was that this was a boon for Apple lovers, because they could now do as they pleased with their phones. A few more sober thinkers questioned why any Apple lover would be pleased that their phones could now be rooted simply by visiting a website. As the realization dawned on a few of the slower thinkers that this meant iPhones were hackable in the worst possible way, a bunch more piled into the fray prattling on about how malware is just as much a problem for Apple as for Microsoft.

Aye, but there's the rub. Malware isn't actually the problem... yet. The most curious thing about this whole debacle is that when these insecurities become zero-day exploits in the wild, the first thing hackers do with them is try to improve the iPhone for its owner. What other platform suffers from this breed of malware?

Do not use this command

posted on Jun 23, 2010
From the hdparm man page:

This command is EXTREMELY DANGEROUS and HAS NEVER BEEN PROVEN TO WORK and will probably destroy both the drive and all data on it. DO NOT USE THIS COMMAND.

This command is so scary, they are probably afraid to even remove it from the system and delete the documentation for it.

The problem with web standards

posted on Jun 21, 2010

The W3C designed the web and HTML standards to accomplish a noble goal: the semantic organization and cross-indexing of human knowledge. Large parts of the HTML standard are oriented towards this somewhat academic goal: tags such as <cite>, <acronym>, and <address> have been around a long time, and yet 99% of web designers never use them. Why not? Because 99% of web sites don't give a damn about the W3C's ivory-tower objectives. They have a completely different objective, and the W3C objectives are actually an impediment to them. Fortunately (for them), HTML is flexible and tolerant enough that it is easily subverted to their own purposes.
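
For the record, here is roughly the sort of markup the W3C had in mind. This is a made-up fragment for illustration; the tags are real HTML 4.01, the content is invented:

  <p>As <cite>The Mythical Man-Month</cite> famously argues, adding people
  to a late project makes it later.</p>
  <p>The <acronym title="World Wide Web Consortium">W3C</acronym> would like
  every page to be marked up with this kind of machine-readable precision.</p>
  <address>Corrections to the author: editor@example.com</address>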

The objective of most web sites and designers is to deliver digital media and services as cheaply and efficiently as possible. They don't care about (or even understand) the concept of the semantic web, and even if they did, they don't have the time, inclination, or sophistication to restructure their marketing materials to give them an appropriate semantic structure. (And if they did, the marketing department would undoubtedly complain about the unattractive consequences.) The perceived benefit to most website owners is near zero. And yet a substantial portion of their web designers have bought into the line that strict adherence to W3C standards is somehow a Very Important Thing.

But why is it important? To give a specific example, why should we use HTML 4.01 Strict, instead of 4.01 Transitional? From a purely functional standpoint, Transitional is more tolerant, provides better support for legacy browsers, and for most pages offers no perceptible performance penalty. In any other field it would be considered a superior choice for the vast majority of cases, so why is it perceived as inferior by web geeks?
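
For reference, the choice is declared in a single line at the top of the document. These are the standard doctype declarations for the two flavours (a real page would contain only one of them, of course):

  <!-- HTML 4.01 Transitional: tolerates legacy presentational markup -->
  <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
      "http://www.w3.org/TR/html4/loose.dtd">

  <!-- HTML 4.01 Strict: legacy presentational markup is invalid -->
  <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
      "http://www.w3.org/TR/html4/strict.dtd">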

If you want to take the argument to the extreme case, why is XHTML considered superior to quirks mode? Why would anyone rationally choose a mark-up standard that is highly intolerant of errors, and requires extra pedantry in its mark-up, over one that is simple enough to be encoded by one's mother in Notepad, and which is supported to some extent by every browser ever made? Especially if their output is indistinguishable?

Web geeks have lots of answers to this. Unfortunately, most of them boil down to this: stricter standards make more sense to computer geeks. XHTML is awesome because it follows rules. Quirks mode sucks because it's full of ambiguity. (Similarly, 4.01 Strict has better rules than 4.01 Transitional, which tolerates older, sloppier rules.) But the computer geeks are wrong on this score for two reasons that most of them will refuse to acknowledge.

Firstly, tolerance of ambiguity is a Good Thing. It is a vital aspect of any system that attempts to be truly expressive. Computer geeks like to think that you can be as expressive as you want, as long as you follow the rules. (And since they understand the rules, you can always come to them if you need help, for a mere $75 per hour.) But in fact, expressiveness often requires breaking the rules. The early web used tables for layout because the early rules did not take advanced layout into consideration. The W3C conceived tables as a way of presenting structured data, not as a way of presenting entire documents. It is still necessary to use tables today to accomplish some layout tricks that modern HTML standards don't support well. If we were required to follow the W3C's recommendations strictly, the web as we know it would have been stillborn. It would have remained stuck in its original role, as a cheap way to distribute academic papers and reports.

Secondly, the computer geek's respect for rules is not shared by the rest of the world. Overly strict rules are evidence of overly stupid computer programs that lack the intelligence to make reasonable decisions. But the web has evolved to the point where browsers are actually pretty clever. They can encounter all kinds of messed-up code, and still present useful information. So if our browsers have gotten so smart, why do we need to change our web documents to make them more computer-friendly? For most website owners, it stinks of gatekeeping, of a scam to make things more complicated so that expensive salaries and consulting fees are justifiable. If the web site looks okay and everything works, why do you need to spend thousands of dollars to convert everything over to a slightly different flavour of HTML?

Now, I happen to be a web geek, so I already know that there are good reasons why you might want to do this. But I also know that there are also good reasons why not. The mere fact that the latest flavours of HTML are "better" in some geeky way than the previous flavours is not a good reason to change. It's not even a good reason to choose the new flavour for future projects. Geeks tend to like new stuff as a matter of principle, so they might chafe at this suggestion. But unless the new flavour provides a specific feature or function that you really need, don't bother.

X Files

XHTML sounds cool and cutting edge, probably because 'X' is a hip marketing letter. In this case the X stands for XML, and in that acronym it stands for "extensible". So really it should be an E. But EHTML sounds kind of wishy-washy, doesn't it? We probably wouldn't be so eager to convert to EHTML. But XHTML sounds better. X-treme. X-ray vision. X-rated. We want it.

But XHTML is actually pretty lame. For reasons that have nothing to do with web productivity, and everything to do with inheriting arbitrary lexical rules from difficult markup systems like SGML, XHTML will not accept all kinds of perfectly legal HTML markup. So it's basically crippled compared to its predecessors. Seriously crippled, if ease of content manipulation is one of your requirements (and surely that's a major requirement for most web publishers). Furthermore, most XHTML documents are served up as HTML, not as XML, so you aren't even getting the purported benefits of that X. Why the heck would anyone choose such a markup standard?
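
To make that concrete, here is a contrived but representative fragment of perfectly legal HTML, followed by what the same content has to become before XHTML will accept it:

  <!-- Legal HTML 4.01: uppercase tags, unquoted attributes, unclosed elements -->
  <P>A paragraph that never closes
  <P>A bare line break<BR>
  and a bare image: <IMG SRC=photo.jpg ALT=photo>

  <!-- The same content as XHTML: every one of those shortcuts is an error -->
  <p>A paragraph that never closes</p>
  <p>A bare line break<br />
  and a bare image: <img src="photo.jpg" alt="photo" /></p>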

Well, there's one good reason. XML allows you to unambiguously define your DOM (document object model). That's a pretty important feature, in some applications. Repeat: in *some* applications. Is your application one of those? (Hint: if you're not sure, then it isn't.) If not, then by choosing XHTML you have gained a feature you will never use, at the expense of features that are important and useful for maintaining regular websites. If very precise DOM parsing or manipulation is mission-critical for you, then XHTML is the ticket. Otherwise, it may very well be an albatross.

And there's the rub. Web standards exist to make *some* things easier. It's important to know what those things are before you jump on one bandwagon or another. In other words, what are the web standards trying to accomplish, and is it the same as what you are trying to accomplish? Do you even know what the web standards are trying to do? Bearing in mind that the W3C's vision of the web does not match the vision of the vast majority of website owners, odds are that the web standards are only tangentially related to your own objectives.

Alternatives

An illustrative example of how the W3C's vision of the web diverges from reality can be found in its treatment of images. The modern web is rich with graphics, but from the perspective of the web's big thinkers, this is a problem. Images are really hard for computers to understand, and this creates a pretty big chink in the armour of the semantic web. As more of the web's content is represented by images, the web becomes increasingly opaque and difficult to comprehend, except by direct application of eyeballs to screens. It's more than just an academic problem, because some web surfers are visually impaired, and for them even the eyeballs-to-screens method of understanding images does not work. And yet images are undeniably important, even in the original academic conception of the web, where they were used for graphs and figures.

It is such a serious problem that it was dealt with some time back by the simple expedient of requiring web designers to tag their images with an ALT attribute, giving a brief text description of the image. This is a simple and effective solution. The W3C validator will find all of your images that are not appropriately described, and flag them with a big error message that says you must provide an ALT tag. Unthinking web designers obediently add descriptions of their images, the errors go away, and everybody is happy that their pages are now "certified" good quality by the W3C validator, which lets them add its badge of approval to the page.

Except they're not good quality. The vast majority of images on a modern web page are cosmetic, and serve no semantic purpose. They are fluff, meant to instill good feelings in the viewer, or provide a bit of visual assistance in delineating parts of the page. Some are not even visible, being spacer graphics or struts that push stuff around for a more pleasing layout. In the W3C conception, images are (or ought to be) real content. In the real world's conception, most images are lipstick for the pig. When these two world views collide, you get web pages with dozens of spacer images all marked up with "spacer image" ALT attributes, and various cosmetic feel-good images with little semantic value marked up with "informative" ALT tags like "image" or "photo". You also get senseless repetition, as cosmetic images carry ALT tags that simply repeat something that is already stated in regular text. This is not what the W3C was hoping for. This is not the intent of the standard. But it's exactly what blind adherence to standards has produced.

Considering how images are actually used in the real world, they should not be required to carry an ALT attribute. The W3C's assumption is that images are content, when in fact they are often merely style. CSS allows you to push a lot of this stuff out to your stylesheet, but in practice this does not eliminate the problem; it only keeps it down to a dull roar. Once you understand the W3C's intent, you can work around their faulty assumptions in the real world by setting ALT="" for cosmetic images. This effectively expresses that the semantic value of the image is nothing. But this is a silly thing to require, since it should obviously be the default, and it just adds a bunch of unnecessary data to the page, and unnecessary work for the developer.
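
In markup terms, the workaround looks like this (file names invented for illustration):

  <!-- Real content: the description carries actual meaning -->
  <img src="sales-2009.png" alt="Bar chart of 2009 sales by quarter">

  <!-- Cosmetic fluff: an empty ALT says "nothing to see here" -->
  <img src="swoosh-divider.png" alt="">
  <img src="spacer.gif" width="1" height="20" alt="">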

Web purists might now be thinking, "Put your cosmetic images in CSS, where they belong." For example, say you want to pop a big splashy stock photo in the heart of your page, but you know in your heart that it has zero semantic value. If the image is more style than substance, perhaps it does belong in CSS. So we create an empty DIV in the heart of our page, and then we modify our stylesheet to give this DIV appropriate dimensions, and a background image (which is our stock photo, of course), and this appears to get the job done. Viewers of the page see the stock photo in all its glory, but in the document markup, it is nowhere to be found. As far as the W3C is concerned, the photo is semantically irrelevant, which is precisely the case.
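
Concretely, the trick looks something like this (the ID and file names are invented for illustration):

  <!-- In the document: an empty, semantically silent placeholder -->
  <div id="hero-photo"></div>

  /* In the stylesheet: the "content" sneaks back in as decoration */
  #hero-photo {
      width: 600px;
      height: 400px;
      background-image: url("stock-photo.jpg");
  }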

But this supposed solution is in many ways worse than a regular image with blank ALT text. For starters, it's a hack, no matter how pure it seems to CSS fans. You are using an attribute explicitly intended for backgrounds to render an element that for all intents and purposes is a foreground image. It's also a content management nightmare. Instead of simply placing an image tag where the image goes, you have to place a much less intuitive DIV tag and give it a unique ID. Then you have to edit your stylesheet (which is written in an entirely different language) to specify the exact dimensions of this DIV, and also specify the URL to the image. This will be well beyond the capabilities of many casual web authors whose understanding of HTML and CSS does not go beyond the WYSIWYG HTML editor that they interact with.

Using your Head

Another very common mistake by developers is using headings inappropriately. HTML provides a convenient set of six headings, H1 through H6, to help you mark up your documents, and most web authors misuse every one of them.

Because most style sheets scale the size of headings down as you increase the heading number, it is widely assumed that the heading number is just a convenient shorthand for visual prominence. But in fact, the heading number corresponds to the document subsection. An H1 begins a top-level section, an H2 begins a sub-section (section 1.1 of the document, for instance), an H3 begins a sub-sub-section (1.1.1), and so on. This means that heading numbers are actually a powerful semantic tool for defining the structure of large, complex documents.
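
Used as intended, the heading levels mirror a document outline, something like this hypothetical skeleton (the section numbers are implied by the nesting, not typed into the text):

  <h1>Annual Report</h1>      <!-- section 1 -->
  <h2>Finances</h2>           <!-- section 1.1 -->
  <h3>Revenue</h3>            <!-- section 1.1.1 -->
  <h3>Expenses</h3>           <!-- section 1.1.2 -->
  <h2>Operations</h2>         <!-- section 1.2 -->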

That makes perfect sense when you understand what the W3C is all about. But once again, it's nearly useless when you see how the tags are actually used in the real world. Very few sites even conceive of their pages as having sections and subsections, much less use their heading tags to delineate these correctly. Few sites publish documents long and complex enough to make effective use of heading levels, and the majority of those that do prefer to paginate the document so that it's broken into easy-reading chunks with more advertising opportunities. Needless to say, the pagination does not conform to W3C expectations about section structure.

Validation

Headings and images are good examples of how we're all using HTML incorrectly and contrary to the W3C's intent. And these are basic tags; it doesn't get much more fundamental than these. What hope is there that we will make proper use of the more interesting semantic tags, such as CITE, ACRONYM, ABBR, and DFN? Fortunately, the question is moot, because we don't really use those tags at all. They just go over most of our heads, which alone should convince us that we may not be using the web the way the W3C intended.

And yet, despite the fact that we're basically doing it all wrong, we still have a mad desire to get the W3C to validate our pages. The W3C even encourages us to do so, by offering cute little merit badges that we can place on our pages if they pass their validator.

Since the validator only checks syntax, not semantics, the big question that nobody ever seems to ask is: who cares if your syntax is correct if your semantics are incorrect? In other words, why are we obsessing about our spelling, when our sentences are gibberish?

The Purpose of Standards

Ask an average techie about the importance of web standards, and you will probably get an answer that talks about compatibility and interoperability. Standards allow you to maintain a working relationship with a range of vendors, operating systems, browsers, legacy software, future software not yet conceived, and with alternative devices for the disabled, without ever needing to test directly against those devices.

The trouble is, this answer is wrong. It is correct for most fields of software, which is why we think it is right. But we're talking about the web, and the web is different. The HTML specification is forwards- and backwards-compatible by design, and all the more so in quirks mode. Unknown tags and attributes are simply ignored, while their content is not. The compatibility is implicit; you don't gain it by declaring a hard spec to adhere to. If anything, forcing the user agent into a hard spec should reduce interoperability, since some agents won't support the spec.

The test of this is simple: invent a new tag that you imagine (wrongly) will be in a future HTML 6.x specification, and go ahead and insert it into all of your web pages in anticipation of that happy day. For example:

The "wow" tag makes everything inside it <wow>more exciting</wow>!

Not only do your pages not break, the content inside your imaginary tag still renders in a reasonable way. And there isn't even a detour into quirks mode. So the compatibility rationale is basically bogus.

There is a slight theoretical compatibility benefit to declaring a specific standard, which is to announce your intention for how ambiguous or evolving elements should be understood in the context of your document. For example, say you have documents containing old HTML 3 markup like <FONT> tags, which you don't want to support anymore, but nor do you want to edit all of those files simply to remove the offending tags. It would be nice to just redeclare the documents as HTML 4.01 Strict, which does not support FONT tags. Since unsupported tags are ignored (as in our test above), the FONT elements should effectively disappear from our documents, right? Well, no. This doesn't actually work. Browsers will actually implement the FONT tag that does not exist in your declared specification, borrowing silently from another specification that does support the tag. And they will claim to do this while remaining in "standards compliance mode".
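
A minimal test case makes the point. The doctype below promises 4.01 Strict, which has no FONT element, yet the text still renders big and red:

  <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
      "http://www.w3.org/TR/html4/strict.dtd">
  <html>
    <head><title>How strict is Strict?</title></head>
    <body>
      <p><FONT COLOR="red" SIZE="7">Not part of this specification.</FONT></p>
    </body>
  </html>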

So much for standards.

The Real Purpose of Standards

It's not that standards are poorly supported on the web (although arguably that is the case). The real issue here is that most of us don't understand what purpose the standards are really serving, which is machine readability. The architects of the web want machines to be able to comprehend what they are reading--not in the sense that you and I comprehend it, but in the sense that they could index and search it better, and do Wolfram Alpha-like information processing and distilling for us, rather than just show us a bunch of links with a better-than-average chance of being relevant to the search terms we typed in.

Of course, since we content creators haven't actually written the web to allow for this, and show no real inclination to start despite our misdirected obsession with web standards, the machine readers really don't have much to work with. And that's why Google is still showing us a bunch of not-quite-random links, and why the first attempt at a true knowledge engine, the aforementioned Wolfram Alpha, ended up not being a web search at all.

So should we care about our HTML standards as much as we think we should? Let's just imagine for the sake of argument that we have a web document that is semantically marked up to convey maximum meaning in exactly the way the W3C intends. And then we run it through the validator to ensure that the markup is syntactically perfect. What have we gained from this exercise, practically speaking?

Hard to say, really. We wouldn't really know until enough people did it and it started to become possible to build semantic search engines to make use of all that latent potential. But nobody really knows when that latent potential will attain critical mass, and until it does, playing along demands a large amount of effort with no immediate tangible benefit. And that's just considering the semantic tags already available to us in lil' old HTML. When you take a look at the other markup technologies that the semantic web gurus want to throw into the mix, it's positively monstrous.

Which means, if you work in the real world where people don't like to spend their money (in other words, your time) without an immediate payoff, it's just not going to happen. The web will continue to be an organic, unpredictable medium that fails to do what its engineers want it to. But fortunately, it will continue to surprise us by doing things we never expected it to.

Brand X

posted on Jun 10, 2010

I know we're supposed to pronounce OS X as "O-S-Ten", but I refuse. The official pronunciation was an old marketing gambit by Apple, an attempt to convince customers that OS X was the next incremental improvement on Mac OS System 9. It worked for the most part, but years have since passed, and it's time to give up the pretense that OS X was an incremental update to System 9.

"OS X" was actually a pretty clever name for Apple's once-new OS, not just because of the sequential serendipity of the Roman numeral 10, but because X was also a reference (to those in the know) to the unix at the heart of the system.

For reasons that have to do with 1980s-era trademark disputes that are so complicated that they are still being fought about today, most companies producing operating systems based on unix technology forsook the Unix brand, and adopted another name with an "X" in it. This is how the world ended up with not just SysV Unix, but also AIX, A/UX (Apple's own), Ultrix, Irix, Xenix, HP-UX, Linux, and many others. The attempt to standardize methods across all of these systems was itself called POSIX. The cross-platform unix windowing system was simply 'X' (although, like OS X, there is a double-pun here since its non-unix predecessor was called W). This naming scheme was a subtle and trademark-free form of co-branding, and few unix companies elected to forego this little trick. (Sun was one of the few to succeed with SunOS/Solaris.)

NeXT was another in the long line of X-brand unix OSes, and in the unix tradition made liberal use of puns in its obscure nomenclature. Aside from the 'NeXT' name itself, the system kernel was called XNU. This is just UNIX spelled backwards. Well, almost: XINU was already taken, but XNU was more or less the same acronym: X(i)NU is Not Unix.

Anyway, XNU is still the kernel of OS X, which is basically a rebranded NeXTstep, so it's pretty clear to the hard-core nerds what the X in OS X really refers to here. So some of us have long ignored the official pronunciation guide and deliberately said "O-S-Ex" even after being corrected by people who read too much Apple PR.

Apple's attempt to rewrite its own history has always had one really clumsy side-effect. What to call new releases of "O-S-Ten"? The obvious choice would seem to be "O-S-Eleven" etc., but clearly Apple has avoided this, giving the lie to the whole "System 10" myth. They really can't decide whether X is a version number or an X-brand unix reference. In a misdirected attempt to clarify things, they explicitly added version numbers 10.* after the X. So we're supposed to say "O-S-Ten version ten"? For a company that prides itself on good design, this is surely one of the goofiest marketing decisions ever. It's not only redundant, but it's also redundant. Will there ever be an "O-S-Ten version eleven"? Or should we expect one day, not too far from now, an "O-S-Ten version ten point ten"? Did I mention redundant?

Plus it leads to pointless confusion about paying for minor-version-number upgrades every time a new release of OS X comes out. Apple's even goofier way of addressing this is not to adopt a sensible version numbering scheme, but to name the versions after cats. But I for one can never remember whether panther beats tiger, or if it's the other way around.

Clearly one part of the marketing department was enamoured of the false "System 10" mythology, while another part of the marketing department thinks OS X can stand on its own two feet as an X-brand unix without a nod to obsolete product lines that nobody cares about any more. But the series of compromises that these two groups have come up with to support their different visions of the OS X brand are just plain dumb, and they get dumber with every year that System 9 has been dead and buried. Can we let it go already?

The NeXT big thing

posted on Jun 6, 2010

People tend to assume I'm an "Apple guy" when they see my laptop. Sometimes they even ask, in which case I'll try to correct them. I'm actually a unix [1] guy. But most people don't understand what that is, or where to put me on the tired old Mac/PC spectrum. Here's an informative diagram that might help [2]:

[Embedded Dilbert comic strip, via Dilbert.com]

My wife hates it when I grow a beard, and I don't wear suspenders, so I really only have the smug expression and condescending attitude to go on. I suppose that makes it understandable that I could be mistaken for an insufferable Mac fanboy [3], but it's purely coincidental. I'm often annoyed by the Apple Way. I haven't really been an Apple fan since the Apple ][, and up until about 10 years ago I was in general agreement with the Mac haters, that Macs were overpriced, technically substandard toys.

But then something happened. As Apple Computer was entering its death spiral, there was a coup, and a little outfit called NeXT snuck in and took over the company. This was a reverse takeover in the sense that Apple technically bought NeXT, but NeXT's management and technology took over Apple. Whatever the financial details, the practical result was that a cutting-edge but underperforming Unix shop took over a failing multi-billion dollar consumer electronics company, aggressively cleaned house, and proceeded to take over the world.

Few people understand the nature of the revolution that occurred with NeXT's takeover of Apple, because although NeXT dominated the future of the company, they quietly dropped the name and instead ran with the more recognizable Apple Macintosh name. This was pure branding sleight-of-hand. In fact, the Mac as the world knew it was dead. System 9 was the end of the road for the platform that Steve Jobs had launched over 15 years before.

OS X was supposedly System 10—you were even supposed to pronounce it "O-S-Ten" to support Apple's mythologizing. But in fact, 10 was not an incremental improvement over 9. It was a different species. Nobody wanted to mention it in anything but whispers, but the Mac was a mouldering corpse. Steve himself murdered it with his own hands, and then he peeled off its bloodied robes and wrapped them around his princeling, his beloved NeXT, dialed up the RDF to max, and announced the new Mac to the world. Long live the king!

And the world fell for it. The coup was a success.

The only thing really Mac-like about the new system was the fact that it came with an emulator and translation layer to run legacy Mac applications, and an API (Carbon) so that old-school developers weren't thrown to the wolves. But other than that, this new machine was essentially the NeXT ][. It wasn't nearly as new as Apple wanted you to believe. The Cocoa API was basically NeXTstep 5 [4], and NeXTstep had been around since the 1980s, almost as long as the original Macintosh. [5]

I was never a NeXT user. I had become a unix guy when I was working in academia and research, and NeXT was always trying to sell into academia because that's where all the unix guys were, and they figured we would just get it. And yeah, we got it: they were sweet, slick little machines. If you could afford one, you could do nifty things like invent the World Wide Web. But frankly, they were underpowered as workstations and expensive as desktop PCs. My shop needed horsepower more than it needed slick user interfaces, so we would do weird things like buy big SGI 4D and Onyx systems, designed for cutting-edge graphics rendering, decline the graphics cards, and just hook up a 15-year-old VT100 terminal to them. The SGI salesmen were confused by this, but at least they could make a sale because their stuff was fast. The NeXT guys couldn't get that far with us. Not with many labs, in fact. They sold only 50,000 machines, ever.

But damn could they put together a user interface. They did what was considered by many to be impossible, and made unix a genuine pleasure to use. And when they took over Apple, they pulled off an even more impossible feat, and convinced everyone that unix, the cryptic beard-and-suspenders operating system, really ought to be the first choice for your grandma's computer. Almost overnight Apple changed from a failing company with an outdated operating system, to the seller of the slickest, most sophisticated unix workstations that money could buy. And for a decent price, compared to the old NeXTcube.

Not only did NeXT-cum-Apple become the world's top seller of unix workstations, but they timed it perfectly, just as all of NeXT's rivals in the heavyweight unix market were hitting the wall. SGI, king of the slick unix workstation market, was entering its own death spiral. Digital Equipment was bought out in 1998, a year after NeXT took over Apple. Sun fell on a decade of hard times after the tech bubble collapsed, and was finally sold to Oracle. Linux was undercutting everyone. NeXT itself had stopped making hardware a few years before. Just as the commercial unix world went into crisis, NeXT deftly exited the whole market and reinvented themselves under the Apple brand.

Apple's official history does not lay things out this way, because the official history is not the history of computers, but the history of the brands, and Apple and Macintosh are two of the biggest brands in the world. Branding is all about inventing a narrative that will help you sell things, not about telling the truth. So the world heard all about the exciting "new" things that the Macintosh could do, and not a word about the daring coup of a struggling 12-year-old academic operating system that co-opted the Mac brand and used it as a trojan horse to insert itself into consumers' homes.

Virtually everyone in the tech press went along with Apple's branding narrative, because it was compelling. Underdog Mac gets up from the mat after a 9-count, and against all odds, starts punching his way to victory. Wow! Great stuff. The real story from the murky and fragmented unix world was way too confusing to get any traction. But because the press happily went (and still goes) along with what is really an imaginary narrative, there are a lot of things they get wrong.

So this blog might be about the things that the blogosphere gets wrong because they are too wrapped up in their own storytelling and mythmaking. And it may also be about the things that Apple gets wrong, as the mythmaker-in-chief busily constructs its own artificial reality that is regularly at odds with its actual history and technology. It will not be about the usual Apple buzz and hype, because there are already a thousand bloggers covering that more enthusiastically than I ever could. Mostly, it will just be about my experiences as a user of the best(-kept-secret) unix workstation money can buy. If it is ever smug and condescending, it will be in a beard-and-suspenders kind of way.

Notes

  1. I will use capital-U Unix to refer to the official Unix brand, and small-u unix as the actual operating system technology.
  2. Oddly, licensing a Dilbert strip for a blog with zero unique monthly visitors costs $100, but they encourage you to embed the strip for free.
  3. Never met one of this fabled breed, to be honest. They might be mythical.
  4. NeXTstep 3.3 was released in 1995, and NeXTstep 4.0 was in beta at the time of the takeover.
  5. The original Mac was released in 1984, Jobs was fired in 1985, and founded NeXT almost immediately. So it was basically the anti-Mac, the grudge-Mac, the pretender to the throne, from day 1.