Okay, not Netcraft, but recently people have been reluctantly admitting that desktop Linux is dead. I think it took the tablet wars to finally wake them up to the fact that the day of the desktop is done. The real energy in the industry is pushing in other directions. And since Linux didn't win the desktop war, it must have lost it.
The thing nobody in the Linux community wants to admit is that desktop Linux died in 1997. That was about when RedHat bundled FVWM95 as the default window manager in RedHat 5. Before that, Linux was targeting the high-end unix workstation market, to devastating effect. Not a single major unix vendor survived that era with its value intact. Only Sun really survived at all, and limped on for a while before being bought out by Oracle.
Linux was a ferocious unix-killer because it provided 90% of the value for 10% of the cost. Its desktop was a unix workstation desktop--large, not much going on cosmetically, but versatile and highly functional for professional, technical users. Professional workstations used to start at about $15-$20K. Linux let you pick one up for a couple thousand dollars brand new, or less if you were recycling an old PC.
What RedHat and the FVWM95 travesty did was definitively re-orient Linux from a unix killer to a wanna-be Windows PC killer. The target was no longer professional, technical users, but discount-conscious consumers. But now they were offering 90% of the value of a Windows PC at 90% of the cost. This was hardly the value equation that allowed Linux to utterly destroy the unix workstation market. It was hardly a value equation at all, and Linux advocates found themselves having to differentiate their platform with esoteric ideas about software freedom, which casual computer users don't understand and really don't want to spend time thinking about.
What is more, most PC users don't really know how to use computers, which is why the Windows UI is such a cocked-up dog's breakfast of broken metaphors and poorly-implemented rip-offs of other companies' ideas. Half of the bad design decisions in Windows can be traced to the fact that early versions of Windows ran on low-resolution 14" monitors, which aren't big enough to support a proper desktop metaphor. Start buttons, taskbars, maximized windows, monolithic apps, poor drag-and-drop support--all of these Windows disasters stem from the fact that you cannot really multitask on a small monitor, because you can't see two things at once and still work on them both. (Same problem you have on smartphones today, really.) But since unix workstations typically used high-res 17-21" monitors in the 1990s, they did not have such problems.
Re-orienting Linux to target Windows users was a colossal strategic blunder. It squandered much of the goodwill previously gained in the professional ranks, and moved major parts of the operating system and user interface backwards in an attempt to appeal to consumers who were accustomed to working on crappy machines. And somehow, Linux advocates imagined that these users, accustomed to substandard computing experiences, would embrace a clone of that substandard experience--one that didn't even run the proprietary software they had learned at great cost in personal time and frustration. FVWM95 was the canary in the coal mine.
The final nail in the coffin was when Apple released a full-blown unix OS that attempted to better Windows, not ape it, and which did support the major proprietary software packages. That was 10 years ago. The last shovelful of dirt on the grave came with OS X on Intel processors, which let you run Linux apps natively and Windows apps in fast virtualization or in multiboot--the ultimate compatibility machine. That was 5 years ago. Desktop Linux has been dead a long time.
A superior long-term strategy would have been to differentiate Linux from Windows (and OS X, for that matter) by taking up the mantle of the high-end unix workstation, and presenting Linux as the elite computing platform, the true successor to SGI and Sun. In hindsight it would not have changed the actual market-share numbers much, but it would have produced a superior computing platform ideally poised to succeed in the post-tablet world.
Why would an elite desktop platform succeed where desktops in general are dying? Because desktops are dying in the 90% of the market that doesn't give a crap. Desktops like Windows or Gnome or Aqua are not a feature for these people--they are an annoyance that must be tolerated to get something else done. These people will migrate to iPads and the like, and the desktop marketplace will increasingly be dominated by professionals. Just like it was back in the 1990s, when Linux was destroying its competitors.
I probably spend more time in Linux than in OS X, and the current releases are pretty damn good for what they are trying to be. But what they are trying to be isn't relevant anymore. Watching Linux trying to master the desktop is like watching Microsoft trying to produce a mobile OS called "Windows". You can only shake your head and hope that one day they will join the rest of us in this decade.
But here's the thing: the death of the Linux desktop might very well be the best thing that ever happened to the Linux desktop. The iPad has shown us the future of popular computing, and it is utterly unlike the old desktop. Doubtless the Linux clone crews are already hard at work ripping it off, using Android or ChromeOS or Ubuntu Touch or some other Linux-based follow-on, and that's fine. Because if they can bleed off the Linux advocates' urge to appeal to non-technical consumers, the Linux desktop might be able to get back to its roots as a professional workstation. And independent computing might be able to get back to its proper role of advancing the state of the art, instead of retarding it.