nu's not unix


NeXT and the online revolution

posted on Oct 8, 2013

The Steve Jobs nobody knew, Rolling Stone magazine:

A year earlier, a hotshot programmer at the University of Illinois named Marc Andreessen had created the first Web browser, and the dot-com revolution was about to take off. There was a sense that something big was on the horizon -- something that Jobs seemed to have no part of ... nothing he was doing at NeXT was really connected to the online revolution.

Other than the fact that the World Wide Web was invented on NeXT computers, that is. A minor detail, I guess.

What's behind Microsoft's fall from dominance?

posted on Sep 10, 2013

Barry Ritholtz seems to be one of the few tech pundits who understand how Microsoft lost its mojo:

Microsoft had its deal with the devil: Its lightning in a bottle was not some awesome technology or brilliant breakthrough -- it was a clause in a contract that led to an enormously profitable monopoly...

Once the Justice Department and the European Commission found the company in violation of antitrust laws, it was forced to compete fairly. It is no coincidence that as the company lost its vice grip on the desktop, its dominance faded. Revealed as a dinosaur, it was unable to compete with the smaller, more-nimble mammals.

Nothing I haven't said before, but my opinions aren't published in the Washington Post ;-)

Do 90% of computer users hate learning?

posted on Mar 14, 2013

Matt Baxter-Reynolds claims that "90 percent of users just use technology to get a job done and have no tolerance for learning at all." (Well, he quotes Simple and Usable by Giles Colborne, but he agrees.)

But then he goes on to say: "This method, by the way, explains Apple quite precisely. Apple's products don't do much, but what they do do requires no cognitive load or expectation of understanding drawing from prior experience whatsoever." Apparently, Apple makes nipples. Or, Matt is a douche who has no idea that Macs are the everything machines, running Mac, Microsoft, and unix software all at the same time, which is more than any other computer can do, and that iOS is basically the same operating system on a pocket gadget.

Apple's products actually have a steep learning curve. The Apple ][, the original point-and-click GUI, the click wheel iPod, the multitouch iOS interface--virtually every major Apple product category has redefined user interfaces, and required users to completely re-learn how to interact with technology.

The difference is that Apple invests a lot of time finding better ways to do things. That means that learning how to use Apple products is a pleasure, especially when coming from an older generation of technology whose frustrations had become routine and expected. Once you figure out how something is done on an Apple device, there is a moment of surprise and relief. "Oh, that's going to be so much easier than how I used to do it."

The most illustrative example, in my opinion, is how you install apps on a Mac. These are unix applications, which traditionally have one of the most user-hostile installation processes in the entire history of computing. Windows is only slightly better, because it hides this nonsense behind an install applet. When people first arrive on a Mac and try to figure out how to install an app they have downloaded, they are stumped, because they are expecting some bullshit of a similar flavour, but there's no obvious way to launch or even find the expected installer program. When they figure out that you just drag the app to your apps folder or desktop, they are so relieved.
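
For anyone who never had the pleasure, the traditional unix installation ritual went something like this (a sketch of the general pattern, not any particular app; the package name is hypothetical):

    tar xzf someapp-1.2.tar.gz        # unpack the source
    cd someapp-1.2
    ./configure --prefix=/usr/local   # hope the dependencies are already there
    make                              # compile, and pray nothing breaks
    sudo make install                 # scatter files across the filesystem

The Mac version of the same task: drag SomeApp.app into the Applications folder. That's the whole procedure.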

The pleasure is in the realization that this annoying and painful process you had become accustomed to is not in any way intrinsic to computing, and that someone has figured out a better way to do it. Learning this may have been difficult, but it is welcome.

But when we are forced to relearn something that we learned at great expense, and the new procedure has no obvious benefit to us, then we do get annoyed. Our internal bullshit detectors start beeping like crazy, and we suspect we are being jerked around for dishonest reasons. In the case of Windows 8, the subject of Matt Baxter-Reynolds's article, those reasons have to do with Microsoft attempting to shore up their Windows-everywhere business model, rather than making the lives of their users easier. Resentment is the predictable outcome.

How the Mac got its mojo back

posted on Mar 5, 2013

Further to my last post, Mike Arrington has some similar thoughts about the success of the Mac being directly related to the Internet.

Arrington seems to think consumers went to the Mac simply because the playing field was levelled by the Internet. But there's more to it. Regular people don't just jump platforms because the playing field is "level". It took them a long time to learn how to use their computer, and they won't change without a good recommendation. And who do you ask for a recommendation? The "computer guy" you know, who is probably a programmer or IT person for something or other.

Those "computer guys" (and gals, of course) were among the early adopters of Mac OS X because they are the types who are willing to jump platforms on their own. They are the ones who aggressively kept Windows XP up to date, upgraded OS X as soon as upgrades became available, and tried out new Linux distros for fun--everything that the casual computer user was not willing to do.

These folks were the first switchers, and found much to like: a superior environment for doing your work in, good compatibility with Microsoft file formats, and great compatibility with the unix boxes in the data centre, with all the killer commercial apps available. It was, in fact, the ultimate compatibility machine--the complete opposite of what a "closed" architecture should be like. Windows soon got relegated to the home desktop rig for running games, Linux got relegated to the server, and OS X found itself doing the day-to-day work because it was the versatile, plays-nice-with-everybody notebook that got lugged everywhere. And because of that portability, it was the Mac that people saw being used, which put it into people's heads that it was an option for their next upgrade. If the computer guy uses it, then… And so they started asking about it.

In those early years, I had this conversation with Windows users, many times:

Them: I didn't know you were a Mac guy.
Me: I'm not, really, I only started using one recently.
Them: How do you like it?
Me: It's great, no problems at all.
Them: I heard you don't have to worry about viruses.
Me: That's true, pretty much.
Them: I need a new computer, mine sucks. Do you think I should get a Mac?
Me: Depends. Do you use anything that is Windows-only?
Them: Oh yeah, I use Office a lot. :-(
Me: Office is available for the Mac.
Them: Really?

Or, if I was talking to someone in the Linux world:

Them: I didn't know you were a Mac guy.
Me: I'm not, really. The new Macs are unix boxes.
Them: Wut?
Me: Yeah, full BSD, like a NeXT. See, here's my terminal.
Them: What!?
Me: Yeah, they come preloaded with emacs, gcc, perl, apache, X11, all kinds of stuff.
Them: They can run Linux software?!
Me: Most open source stuff will compile. See, here's Gimp. Here's LaTeX.
Them: Whoa...
Me: Plus, it runs Office and PhotoShop.
Them: Fuck.

Done, sold. The Mac faithful never really had these same conversations, because they had always used Macs, had always put up with the differences, and people saw them as iconoclasts with somewhat suspicious opinions. But when the computer guy's ThinkPad disappeared one day to be replaced with a PowerBook, that made you think and ask questions.

That's how Apple beat both Linux and Microsoft at "open computing" -- by being open enough to run everybody's software on a single platform, and do it well. Nobody else did that. They still don't, over 10 years later.

Open does beat closed

posted on Mar 1, 2013

John Gruber missed the simplest and most obvious rebuttal to Tim Wu's "open beats closed" thesis, and that is:

Apple is the largest vendor of open source operating systems in the world.

Like many people, possibly including Gruber, Wu is under the mistaken impression that Apple is down at the "closed" end of the scale. He fundamentally misunderstands how openness really works in the world of computing. Apple does understand, however, because they are secretly a rebranded Unix shop.

It may not seem like a big deal to people who were among the Mac faithful the whole time, but the early "switchers" in the OS X 10.0 and 10.1 days were hugely important to driving early adoption and development on OS X. They were mostly unix and internet developers, for whom openness is vitally important.

The Mac was a superior platform for these developers because of its openness in the respects that mattered to them--compiler toolchains, libraries, POSIX, source compatibility with other Unixes, command line environments, internet protocols, etc. The Internet and web have always been dominated by "open" unix-based systems, and the Mac almost instantly became the Web and Internet development workstation of choice because of how nicely it fit into this world, right at the exact time when the Internet was exploding into everyday use.

The Mac was actually a spectacular success in Internet workstations (especially laptops) for the exact same reasons Linux was a success in the data centre. And when I say, "exact same", I mean exact: the same command line tools, the same programming languages, the same shell scripts, the same database environments, the same web server. It was like your Linux production environment was running right there on your laptop. Just rsync and your changes were live on the real server. Yeah, you could do that from a Linux workstation, too, but setting up a decent Linux laptop in 2003 was an exercise in pain and frustration, and even if you got it working, you still didn't have access to the world of commercial apps for Mac (and, after the Intel switch, Windows). It was the PowerBook that freed the unix power developer to go wild on the Internet.
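
To make that concrete: the entire deploy loop I'm describing was essentially one command. A sketch, with hypothetical paths, username, and hostname:

    # edit and test locally on the PowerBook (same apache, same perl, same shell),
    # then push the changes straight to the production box:
    rsync -avz --delete ~/Sites/myapp/ deploy@prod-web:/var/www/myapp/

Because the tools were identical on both ends, there was nothing to translate and nothing to break in between.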

Gruber is right: the consumer doesn't really care about openness. But developers do, and Internet developers especially. And where the developers are, the consumers are bound to follow, because that's where the cool stuff happens first. Microsoft proved that in the 1990s, and it was openness that drew developers to the PC, too.

Some might argue that the transition from web to mobile means that Apple's closed attitudes with iOS will become a problem. Maybe, but I don't think so. Mobile development is still Internet development by and large, just with a different front-end. The fundamental concerns of the developers are with communications, protocols, service architectures, and "the cloud", and iOS still has those unix roots that keep it open in the ways that matter to developers. Android has unix underpinnings, too, of course, but Apple captured the cream of the Internet developer market 10 years ago, and Android is a latecomer. Google either has to woo away Mac developers who have already made their choice, or woo the ones who stayed behind with Windows or Linux on their workstations, and who enjoy programming in Java to boot. I suspect this factor explains everything you need to know about why Android has the "feel" that it does.

iOS maps SUCK!!!1! No, wait, that other thing...

posted on Sep 21, 2012
[image: ios-map]

I upgraded to iOS 6 just to get the new maps app, so I could see what all the bitching was about.

BTW, I'm a map geek. And I love Google Maps. So I was prepared for disappointment. But frankly, the new iOS 6 maps are the bomb. I get it: their map data is young and immature, and there will be many, many bugs of the kind that can only really be shaken out by real-world use. But I didn't actually see any of those, perhaps because I'm lucky enough to be downtown in a major city that has lots of up-to-date info available.

What I did see was an awesome 3-D satellite view (shown above) that could not only be panned, zoomed, and spun, but could do all of that automagically by enabling the compass to set the perspective on the map. It's basically bird's-eye augmented reality. That's hot.

I also tried the turn-by-turn directions, which is something I almost never use (having been blessed with a pretty reliable internal compass) but which I had heard bad things about. So I asked Siri for directions to the Cambie pub, to see how rough around the edges it really was. Siri couldn't find the "Candy" pub, but after I over-enunciated it, she figured it out and gave me directions. It was a pretty simple task, since it was only 2 blocks away, but everything was pretty darned precise, down to the exact metre at which I needed to turn left, and the ETA, which was spot-on. At first I thought Siri had misjudged the location of the pub door, because I had to walk an extra 10 steps past it before she told me I had arrived. But then I noticed that I was right in front of the hotel's main entrance, which also serves as an entrance to the pub, so good enough, and arguably more correct.

As for the accuracy of the map details, I learned that the alley closest to my office has a name: Trounce Alley. This surprised me, because (A) I didn't know it had a name, and (B) the map showed that Trounce Alley actually extends back two blocks to a section known as Blood Alley. On Google Maps, the same 2-block stretch of alley is called Blood Alley Square. So who is right? Turns out the alley is officially named Trounce Alley, and Blood Alley Square is just a colloquial name for the far eastern end of the alley where it joins Carrall Street. So Apple (top) got it right, and Google Maps (below) got it wrong:

Speaking of data quality, the Google Maps view above shows the Woodward building as a construction zone, which it hasn't been since 2009. The Apple map shows a completed building, so its data is not only more accurate, but more current in this case.

My only complaint is that the compass is a bit finicky, complaining regularly of compass interference as if there were big magnets nearby. I regularly had to reset it by waving the phone in a figure-8, and when I set the map to auto-rotate as I turned about, it would swing from side to side a little erratically, sort of like being on a rocking boat. If boats rocked in a spinny kind of way.

I'm sure there's lots of junk data somewhere in the gargantuan data sets that go into mapping apps like this, but for the first week of an app this huge, I'd say they nailed it.

Ultimate hipster iPhone case

posted on Sep 19, 2012
[image: iphone-cassette]

When my fake-cassette-tape iPhone case wears out, I'm buying one of these.

Yes, it's real.

What a delightful system crash

posted on Sep 19, 2012

My MacBook had a complete and total failure today: a hard hang, followed by a failure to reboot, and then a frustrating inability to get far enough into a boot sequence or diagnostic to even identify what the failure might be. So I walked it over to the Apple Store, and even though the computer was out of warranty, they hooked it up to their fancy diagnostic systems, deduced that it was a faulty HD cable, and performed the repair. After all that, they only charged me $20, and that was for the new cable. I only waited about 10 minutes, and my computer came back with a cleaned screen and keyboard as a bonus. I guess the fingerprints and cookie crumbs offended their hipster sensibilities. Not that I'm complaining.

All in all, the most painless catastrophic system failure I've ever experienced. Although I gotta wonder if this is killing the independent service and repair market for Macs.

How Microsoft Lost Its Mojo

posted on Jul 26, 2012

Vanity Fair has a lengthy article on Microsoft's lost decade that tries to pin down how the software giant fumbled its seemingly insurmountable lead in the tech industry. It's an interesting read, but more interesting to me is how wrong its fundamental premises are. And if you get your premises wrong, your conclusions are sure to be irrelevant.

The faulty premise is this: that Microsoft was once nimble and cool, and somehow lost their way.

How could a company that stands among the most cash-rich in the world, the onetime icon of cool that broke IBM's iron grip on the computer industry, have stumbled so badly in a race it was winning?
... While Microsoft was once the hippest company on earth...
...the Microsoft of old, the nimble player that captured the passions of a generation of techies and software engineers, is dead and gone...
...On August 24, 1995, Microsoft reached the pinnacle of cool, releasing what would then be its largest-selling operating system ever: Windows 95. Seeking to buy the first copies, computer geeks lined up at midnight around the block outside technology stores...

Here's the thing: Microsoft was never cool. And once they grew beyond about 20 employees, they were never nimble.

  • In the 1970s they wrote BASIC interpreters. Interesting to geeks, but not cool, unless you were an electrical engineer who was getting tired of flipping bit switches to enter your assembler code.
  • In the 1980s they bought a 2nd-rate OS and convinced IBM to bundle it with the new IBM PC. Great business savvy, but you have to be pretty socially maladapted to think that DOS was cool.
  • In the 1990s they struggled to bring a GUI to the PC, when every other major computing platform had had one since the mid-1980s. Not cool, and especially not nimble. They also sold a lot of business software. Great stuff, but not cool, unless you're an accountant. They thought the Internet was not important or interesting. This was definitely not cool of them, and it's almost shocking in its lack of nimbleness. When they realized how colossal their error was, they used their monopoly power to gut their competition, and cripple the Internet with buggy, insecure, proprietary technologies. Quite possibly the least cool, least nimble thing ever done in the history of computing.
  • In the 2000s they enabled cyber crime as we know it today, with their insecure, buggy operating system that they didn't seem to know how to fix. Cooler companies were succeeding on the Internet, so they sunk billions into that, but lost money. Cooler companies were succeeding in video games, so they sunk billions into that, but lost money. Cooler companies were succeeding in online music, so they sunk billions into that, but lost money. It was pretty clear that people doing cool things didn't want Microsoft stuff. But at least they continued to sell lots of spreadsheet software. 
  • In the 2010s people with a weak grasp of computing history started to wonder when Microsoft lost their cool. Ahem.

Now, it has to be said that cool things were happening in computing all this time, and Microsoft, as one of the big players in computing, was involved. But Microsoft wasn't really doing the cool things. Mostly they were just fucking things up when cool things happened around them.

From 1975 until 1990, the really cool thing that was happening was the personal computer. It was cool thanks to Apple, who brought out the first really popular, hackable personal computer. It was also cool thanks to IBM, who copied Apple's design philosophy, but made it much more powerful. And lastly, it was cool thanks to Compaq, who reverse-engineered IBM's design, making this power cheap and affordable to the masses.

Cheap, hackable, democratic technology. That's cool. But it was also hardware, which had nothing to do with Microsoft. Bill Gates simply rode the coattails of the hardware revolution. He got his start by writing BASIC for the Altair 8800, and this led to Apple hiring Microsoft to write Applesoft BASIC for the Apple ][+. Meanwhile, IBM was outsourcing the operating system for its new IBM PC, but the cool choice, CP/M, was turning out to be problematic. Rather than suffer through painful negotiations with Digital Research, IBM said 'fuck it' and asked Microsoft if they had anything to offer instead. Microsoft said yes, even though they didn't, and quickly went out and bought a 2nd-rate CP/M rip-off that they licensed to IBM. Brilliant move on Bill Gates' part, and arguably nimble from a business perspective, but not exactly a cool or innovative technology.

When Compaq reverse-engineered the IBM PC, they kick-started the clone market, driving costs down and performance up. And because of the IBM deal, MS-DOS had lucked into becoming the industry-standard default OS on these cheap, ubiquitous computers. Money started to pour in, and Microsoft accidentally found themselves in the leadership role that was supposed to be IBM's. And as a result, they started to take credit for the PC revolution, in their own minds.

Meanwhile the actual cool kids were working on the next great thing: graphical user interfaces. By 1984, everyone else had one. Apple had the Lisa in 1983 and the Mac in 1984, Sun had SunView in 1983, and Unix had the X Window System in 1984. Microsoft was not only late to the game, releasing Windows 1.0 in 1985, but unlike the others it was basically an unusable piece of shit, little better than a technology demo. Nobody used it. They didn't use version 2.0 either. Version 3.0 actually worked, but it wasn't genuinely useful until 3.1, released in 1992, 7 years late. Microsoft continued to struggle to catch up with the cool kids, and finally achieved a kind of parity with the release of Windows 95, ten years after entering the GUI game. Not nimble. Not cool. But desperately needed by PC users everywhere, who did in fact line up around the block to get their copies. Windows 95 was immensely popular because in 1995 GUIs were no longer cool--they were necessary.

In fact, by 1995 the cool kids had all moved on to a new area: the Internet. Microsoft was so busy trying to address its lack of cool from the 1980s that they didn't even see this one coming. Bill Gates famously dismissed the Internet when people complained that it was a major weakness in Windows 95. And when new companies like Netscape popped up to take advantage of Microsoft's pig-headed sloth, Microsoft crushed them mercilessly with their monopoly power, setting the entire industry back years. By the late 1990s it was already clear that Microsoft was the new Bell Telephone, monstrous, sluggish, domineering, and uninterested in technical innovation.

In 2001, Microsoft achieved final victory in the browser wars with its release of IE6, one of the greatest blights on computing that the world has ever seen. It not only set the world back years in technological terms, but also enabled the entire modern world of cybercrime with its appalling lack of security.

But it didn't matter, because the cool kids had already moved on. The Internet revolution was not really happening in the browser, after all. It was happening on the Internet itself. And by 2001 it had been going on long enough to actually create, inflate, and then pop its own economic bubble. Once again Microsoft belatedly blundered into this new technological arena, this time using their massive war chest to buy relevance, first by copying AOL, then by copying Yahoo, and then by copying Google. The only result so far is that Microsoft has lost billions of dollars.

And now here we are today. Mobile technology is the newest playground for the cool kids. Apple has reinvented itself as the dominant player in this marketplace, and Google has asserted itself as the mass-market brand. And what is Microsoft doing? 5 years after the debut of the iPhone, 4 years after the debut of Android, Microsoft has decided that it should go into mobile, big time. Sound familiar?

So no, Microsoft was never cool or nimble. The historical evidence shows that it is one of the most slothful and intransigent companies in the whole technology sector. Vanity Fair asks how Microsoft will get its mojo back. But it never had mojo, there is no mojo in its DNA, and any attempt to get it back will be grasping at phantoms, wishful fantasies about an innovative, pioneering Microsoft that never existed.

More retro terminal fun

posted on Aug 9, 2011
[image: vt220mac]
Not satisfied with the awesomeness that is Cathode, JSTN at tumblr is doing it in true old school style, hooking up a vintage VT220 terminal to his Mac. (Not to mention blogging about it in a line printer font.)