Technology: What ails the Linux desktop? Part I.
The basic failure of the free Linux desktop is that it's, perversely, not free enough.
There's been a string of Linux desktop quality problems, specific incidents reported by +Linas Vepstas, +Jon Masters, +Linus Torvalds and others, and reading the related G+ discussions made me aware that many OSS developers don't realize what a deep hole we are in.
The desktop Linux suckage we are seeing today - on basically all the major Linux distributions - is the final symptom of mistakes made 10-20 years ago: the death cries of a platform.
Desktop Linux distributions are trying to "own" 20 thousand application packages consisting of over a billion lines of code and have created parallel, mostly closed ecosystems around them. The typical update latency for an app is weeks for security fixes (sometimes months) and months (sometimes years) for major features. They are centrally planned, hierarchical organizations instead of distributed, democratic free societies.
What did the (mostly closed source) competition do? It went in the exact opposite direction: Apple/iOS and Google/Android consist of only around a hundred tightly integrated core packages, managed as a single well-focused project. Those are developed and QA-ed with 10 times the intensity of the 10,000 packages that Linux distributions control. It is a lot easier to QA 10 million lines of code than to QA 1000 million lines of code.
To provide variety and utility to users they instead opened up the platform to third party apps and made sure this outsourcing process works smoothly: most new packages are added with a few days of latency (at most a few weeks), and app updates are pushed with hours of latency (at most a few days) - basically it goes as fast as the application project wishes to push it. There are very few practical restrictions on the apps - they can enter the marketplace almost automatically.
In contrast to that, getting a new package into the major Linux desktop projects takes many months of needless bureaucracy and often politics.
As a result the iOS and Android platforms were able to grow to hundreds of thousands of applications and will probably scale fine beyond a million apps.
(Yes, we all heard of the cases where Apple or Google banned an app. Don't listen to what they say but see what they do: there's literally hundreds of thousands of apps on both of the big app markets - they are, for all practical purposes, free marketplaces from the user's point of view.)
The Linux package management system works reasonably well in the enterprise (which is a hierarchical, centrally planned organization in most cases), but desktop Linux, on the other hand, stopped scaling 10 years ago, at the 1,000-package limit...
Desktop Linux users are, naturally, voting with their feet: they prefer an open marketplace over (from their perspective) micro-managed, closed and low quality Linux desktop distributions.
This is Part I of a larger post. You can find the second post here, which talks about what I think the solution to these problems would be:
https://plus.google.com/u/0/109922199462633401279/posts/VSdDJnscewS
It's the diversity of its software ecology that makes Linux-based distributions so unruly. Why are we still talking about the "Linux desktop" in a world where the most common personal computer in use today is a handheld communications device? Over 700,000 Android devices, each with a Linux kernel, are activated every day.
Apr 6, 2012
+Ventak HR I completely agree that Apple has the most restrictive platform policies. E.g. this Harvard paper "Opening Platforms: How, When and Why?" http://www.hbs.edu/research/pdf/09-030.pdf (page 2) gives this order of openness (from open to closed): Linux > Windows > Macintosh > iPhone.
On the fuzzy arguing about "broken installers on Windows": that effectively doesn't happen anymore; "dependency hell", installer problems etc. were fixed years ago (around 2000). The desktop usage experience on Windows, it must be said, is plainly great.
"This means that for the sake of "it just works" Apple and Google take total totalitarian control over their App Market delivery system and can pull or disapprove apps and control and limit the evolution of those apps forever."
This argument supports Ingo. Ingo asks why the Linux ecosystem is not providing an uncontrolled market platform ('the bazaar'), completely free of political control, unlike what these markets now suffer under Apple or Google.
Ingo thinks "just works" and free as in "free speech" are not in conflict, and I completely agree. The model we have (or "suffer", according to this point of view: http://news.ycombinator.com/item?id=3716884) now with the centralized distro approach is far less free than his vision. The control the distributions have over the repositories is maybe not commercially motivated, but it IS a form of control and limitation, and also a practical hindrance to direct distribution of software to users. This was criticised multiple times already, e.g. by Mike Hearn (autopackage) years ago: [...] Hearn has critiqued native packaging systems as both out-dated and anti-democratic. "The whole idea of packaging/installation is bogus and leftover from the times when software was distributed on floppy disks," Hearn claims. "The web 'instant activation' model is better [...]" (http://web.archive.org/web/20080331092730/http://www.linux.com/articles/60124)
"An enterprise Linux distro company like Red Hat has the right and the need to maintain it's own ecosystem for it's own engineering reasons"
Here I could not disagree more. The prioritization - that it is completely OK for distributions to break compatibility and platform experience for minor technical advantages - is bogus, misfocused, and one of the reasons for the problems the Linux desktop is suffering today.
Apr 6, 2012
@mandrit Hoffman, you make several really good points, but...
...In reality it's all about "use cases" - that's what Linux is really all about. Each distro adapts itself to a series of use cases which are different and not always compatible. The problem is that each use case changes rapidly or becomes obsolete, and new ones appear all the time. If you watch this interview with Linus Torvalds himself (http://youtu.be/ZPUk1yNVeEI), you will see that the real problem is that there are simply too many desktop configurations, hardware-wise and use-case-wise, and this is the difficulty of the desktop.
It's a matter of combinatorics. Let's examine that in depth. How many hardware devices does the desktop need to support: printers, input devices, hard disks, motherboards, chipsets, RAM, etc.? How many different types of users would it have to support: novice, media editor, application/system developer, web developer, office worker, student, web surfer, tablet user, etc.? How many different systems will it run on: server, mobile, tablet, netbook, laptop, desktop, towers, thin clients, embedded, cars, planes, trains, tanks, helicopter subsystems, NASA spacecraft, etc.? How many different output devices will there be: vision impaired, braille keyboards, large LCD monitors, CRT monitors, touch screens, televisions, etc.?
Scenarios = X * Y * A * B * C * E * F * S * H * (...) = HUGE NUMBER!
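The multiplicative blow-up in the formula above can be sketched in a few lines of Python. Every count below is a made-up placeholder, not real data; the point is only that independent dimensions multiply, so even modest counts per dimension produce an enormous number of scenarios.

```python
# Hypothetical dimension counts - illustrative only, chosen for the sketch.
dimensions = {
    "hardware devices": 500,   # printers, input devices, chipsets, ...
    "user types": 8,           # novice, developer, office worker, ...
    "system classes": 10,      # server, mobile, tablet, embedded, ...
    "output devices": 6,       # braille, LCD, CRT, touch screen, ...
}

# The number of distinct scenarios is the product of all dimension sizes.
scenarios = 1
for name, count in dimensions.items():
    scenarios *= count

print(scenarios)  # 500 * 8 * 10 * 6 = 240000
```

Adding just one more dimension, or doubling the size of any single one, doubles or more the total - which is why no central team can QA every combination in advance.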
One can take anything above the kernel level - that is, those items that deal with userspace and use-case-specific elements - throw it away, refresh, and start over again to adapt for each specific scenario or group of scenarios, and this is one of the core brilliant aspects of the Linux system. This way Linux desktops can accommodate huge ranges of deployment and use scenarios and adapt in a purely objective, independent fashion. This is why Linux is in reality used on probably over 70% of the world's computing devices - mobile, desktop, embedded and servers - because of this deployment characteristic.
On the developer level it's part of our job description and basic application design that one has to adapt and make as many deployable combinations of our applications as possible for these different scenarios. We developers tend to be lazy people, and so the unified approach seems great, but it is out of step with the wider reality that faces us in terms of raw numbers. This is like a developer complaining that all computer hardware manufacturers should only use Intel i386s because it's too annoying to have to develop for AMD and ARM processors and duplicates efforts. It's not practical or even possible for developers to tell system makers or operating system distributors/developers to adjust themselves to our needs because we want to be lazy.
We have to take the tools available, be creative, and ensure that anyone using any system or operating system can use our software and that its operation is uniform across all devices/scenarios. If we can't do that we have failed as application developers; it's that simple, no need to blame distros, manufacturers or anyone else but ourselves. For example, if someone wanted to deploy an office productivity software package over a network at a large corporation and enable the workers to access it over the internet, the deployment, packaging and dependencies would be very different from someone who just wants a word processor application to do their homework or write a letter on their laptop. This is a reality, not a hypothetical.
This is the reason for having .rpms and .debs, yum, apt-get, and even a simple make from source - these are all necessary tools, each great for a particular scenario, and that's why they exist. The utter failure of Microsoft and Apple with their respective desktops is that they both failed to understand this wider ecosystem diversity and forced all devices to use the same systems in a centrally unified approach. Sure, this allowed for a big, single, wide-open application market ("app store"), but one that could only deploy applications for a limited set of scenarios: desktops and laptops. This held back the evolution of different systems and diverse deployments. It would take an entire NT kernel rewrite and upgrade to facilitate such deployments and adjust for major scenarios, and this is why Microsoft and Apple were never able to create any meaningful solutions for the server, embedded or mobile market.
Apple is only just beginning to get into the mobile/tablet game, but their systems - aside from the brilliant artistic endeavours of their terrific user interface design team - are technologically inferior and less capable than comparable Linux-derived mobile systems, and not open enough to operate in the wild efficiently. Also, Microsoft is now facing their own failures head on with the Windows 8 development fiasco, because they realize that their own unified approach failed: their kernel is specialized for desktops, and applying that to mobile touch-screen tablets is difficult and requires too much hacking. If they had developed parallel "ecosystem"-based systems that enabled a modular approach, as the major Linux distro communities (Canonical, Debian, Fedora, Suse, etc...) and companies have with their different approaches to different or similar problems, they would have deployed tablet-based OS distributions years ago and be in a good position going forward.
Likewise, the failure of the wider Linux distro community to adopt "Autopackage" and to develop any meaningful alternative speaks volumes about the core issue here. In reality the only way to deliver your "uncontrolled market platform" is via a single unified and centralized system - to use a word from the BitTorrent world, a "tracker" system. With BitTorrent, sure, you can use many different clients and feel that you have a decentralized market, but in reality it is still centralized in one core protocol. This would impose centralized requirements on the distro, and would be a move that would deprive the distro of its freedom to adapt itself for its own particular use case - which is impractical and impossible given the high number of unique use-case scenarios and deployment systems in use.
Apr 7, 2012
@Ventak HR: OK, your concern seems to be the variety of use cases an operating system faces. Agreed, this is a demanding task... but I don't agree with your conclusion that this must be achieved at the distribution level. You also argue that the variety of distros is required (technically!) to fulfill the needs of all users... I would argue this approach, as a solution for adapting to all the potential use cases, is doomed to fail for two reasons. First, complexity and burden. I would argue the primary requirement of an operating system - adapting to all potential hardware - is task enough (and undoubtedly not a solved problem) for the existing distributions. They should not be burdened with the additional Sisyphean task of trying to "foresee" the use cases of their user group. I take your combinatoric argument and cite Ingo: "the cathedral can't scale beyond 100 packages" - this has exploded... even with your assumed specialized, reduced user group the task is too enormous. Second argument: distributions will always provide only an approximation of the real use case of a user or user group, because individual preferences exist, change, etc. In the end, for perfect adaptation to the use cases we would need as many distributions as users.
But wait!... Apple and Microsoft found a way to achieve exactly that - a distribution for every user - in a non-painful way for both sides. How did they achieve that?... They didn't try to assume the use cases of the users beforehand (that's wasted effort!); they just gave them the freedom to adapt their OS to their very specific use cases with "modules" called "applications"! They allow the users to build their own "user distribution" with the maximum flexibility possible... so every user has their own distribution, all use cases are fulfilled - great! Additionally, the burden of work on the distribution would be (roughly approximated) halved by taking away the ugly decision-making and packaging of applications. Maybe we would then have better, even perfect, hardware support? Better polished desktops, better usability, etc... just by freeing development resources for the distros! Would that not be great?
Therefore, I agree with Ingo totally; this is the only way to go: let the users make their own software decisions, and take this decision away from the central place called "distribution", because even the ten-thousand-package Ubuntu cathedral is orders of magnitude too small to satisfy the variety of needs for applications and funky use cases users like to have. (Noted also, with some examples, by Benjamin Smedberg: http://benjamin.smedbergs.us/blog/2006-10-04/is-ubuntu-an-operating-system/) Distributions should focus again on their core concern: being an operating system, being an as-adaptive-as-possible layer between hardware and software. That's already task enough.
Apr 9, 2012
Sorry, but did somebody claim that Microsoft and/or Apple had produced satisfactory operating systems? Don't think so.
Install a Linux kernel by whatever method you like. Add whatever userland you like. Add some apps. Get on with your life.
Apr 9, 2012
+Tony Sidaway in the sense Ingo was describing - the degree of freedom given to the user for software decisions - yes, both are more successful and satisfactory than the Linux distro approach. (Not to mention the voting with "feet"... the market share numbers.)
Apr 9, 2012