Rethinking Ubuntu's Update Policy…Or Not
One of the timeless challenges of open-source development is keeping software as up-to-date as possible while also maximizing stability. With this difficulty in mind, Ubuntu’s developers recently discussed the operating system’s policy on updates. Here’s the story, with some thoughts.
Last Wednesday, Sebastien Bacher pointed to the dilemma of delivering Gnome updates to end users in the middle of Ubuntu release cycles. While Ubuntu’s general policy is to save non-critical software updates for the next version of the operating system, this practice hurts users affected by bugs that are serious for them but not deemed worthy of immediate fixing. The conservative update policy also means Ubuntu users wait longer for new application features, and it’s something of a slight to upstream developers, whose hard work doesn’t reach the masses as fast as they might like.
Stability, stability, stability
Bacher asked for feedback on how these problems might be mitigated. While some developers pushed for more aggressive update policies that would lower the bar for deciding which bugs are critical enough to be patched in the middle of a release cycle, a majority argued for erring on the side of stability when delivering patches to users. As release manager Martin Pitt wrote:
For stable updates we have several conflicting goals, in the descending priority:
1. keep it running
We must avoid introducing regressions through updates at all costs.
2. keep it safe
We obviously need to fix critical issues like security vulnerabilities and data loss bugs.
3. make it better
Fix major regressions from earlier releases and bugs which impede a lot of users.
Not all bugs are created equal
Rather than pushing more aggressive updates, Pitt argued for streamlining the review process of existing bugs so that approved patches reach users faster. Under current policy, all proposed updates are subject to the same quality-assurance process, regardless of their complexity or importance to the system. This means trivial changes often take longer than they should to reach end users. By factoring the seriousness of a proposed update into the procedure for releasing fixes, developers could cut some of the overhead from Ubuntu updates.
Unfortunately, no consensus seems to have been reached regarding how the review process could actually be streamlined.
Analysis
One of the factors that drove me to Ubuntu from Fedora a few years ago was the latter’s comparatively aggressive policy on updates. More than once, I found that updating my system resulted in broken wireless or X. This isn’t to say such problems don’t occur in Ubuntu–they occasionally do, if posts on the Ubuntu forums are any indicator–and maybe I was just a particularly unlucky Fedora user. But I’ve yet to suffer a show-stopping regression when installing updates in Ubuntu.
I’m therefore pleased to see the Ubuntu developers continuing to promote stability before all else when it comes to updates. Sure, it’s annoying to have to wait for the next Ubuntu release in order to upgrade to the latest applications, but that’s preferable to having the operating system break frequently due to poorly tested patches being sent downstream. And the type of users who care about having the latest version of OpenOffice or Gnome tend to know how to get it on their own, without waiting for Ubuntu updates.
Let’s hope Ubuntu’s leaders continue to think pragmatically and keep stability in mind. If they do, they just might distinguish themselves long enough from the out-of-touch geeks who dominate open-source development to bring Ubuntu to the masses.
But even pretty critical bugs go unfixed. There was a GTK bug that prevented printing from some applications, like Evince.
A Google search turns up a lot on it.
And it’s the number one topic on my blog, even though it’s about the 8.10 release. Lots of visits every day. It only got fixed with 9.04: half a year, because there was no Gnome update in between.
http://brainextender.blogspot.com/2009/01/ubuntu-intrepid-too-many-failed.html
I do not want to blame developers for it – but this is not stability.
my 2 cents
brain extender
I read the article and wondered how much the “Force Version” feature of Synaptic might alleviate the problem. In my own little world, I experienced a minor problem with the podcatcher “Gpodder” recently, and it reflected this article exactly. The fix was already sitting in the Ubuntu repository, just waiting to clear the release factory.
I agree with this article’s basic assertion that stability for the majority of the user community is vital. But I think the answer is quite simple. Firstly, make a bigger deal over the amount of control “Force Version” gives the end user. After all, any good service deliverer will test changes for themselves. Secondly, ensure this feature works well across an enterprise, not just on a single desktop. That last bit is our own check that we don’t turn into Microsoft by undervaluing the element of user choice.
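For readers who prefer the command line, Synaptic’s “Force Version” is roughly equivalent to installing a pinned version with apt. A minimal sketch (the package name and version string here are hypothetical, purely for illustration):

```
# CLI equivalent of Synaptic's "Force Version" (version is hypothetical):
#   sudo apt-get install gpodder=0.16.1-1ubuntu1
#
# Or pin it persistently in /etc/apt/preferences, so later
# "apt-get upgrade" runs keep the chosen version:
Package: gpodder
Pin: version 0.16.1-1ubuntu1
Pin-Priority: 1001
```

A priority above 1000 tells apt to hold that version even if it would otherwise be a downgrade, which is the behavior the “Force Version” dialog exposes.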
I think maintaining stability is a priority, but users should be made aware of sites like http://www.getdeb.net which at least allow them to take the risk and install the latest software
“More than once, I found that updating my system resulted in broken wireless or X. This isn’t to say such problems don’t occur in Ubuntu–they occasionally do, if posts on the Ubuntu forums are any indicator–and maybe I was just a particularly unlucky Fedora user. But I’ve yet to suffer a show-stopping regression when installing updates in Ubuntu.”
That’s bizarre, since Ubuntu’s update schedule is well-known for breaking things with every release. (http://linuxhaters.blogspot.com/2008/06/evolution-of-ubuntu-user.html)
I feel like I’ve experienced major problems with every release, even when I wait a few months to upgrade, and I assume this is because they’re just shoved out the door on a specific date regardless of whether they’re actually ready.
I’m glad that they nominally prioritize “We must avoid introducing regressions through updates at all costs”, but I’m not really feeling the stability myself.
Well, a Linux hater WOULD say that, wouldn’t they?
I’ve only seen major update issues when I’ve used pre-release updates.
@Hmm: The updates mentioned here aren’t releases but mid-cycle updates. Fortunately, I have never seen regressions introduced mid-cycle, but I have had the same problems you mentioned during release upgrades. In fact, I followed Linux Hater’s evolution of an Ubuntu user almost exactly, even to the point of getting a Mac. As for why bugs and regressions are introduced at the point of new releases, there are many reasons, and the short release cycle isn’t the biggest culprit.
Christopher: The annoying problem here is that many users only upgrade to the next release — or update packages — to fix existing bugs, and they often do so not realizing that upgrading presents a real risk of introducing regressions. This is why it is important to allow users the option to easily update certain packages.
For example, when Firefox 3.5 was released, there was no easy way to install it from the repositories. To do so, one had to retain the official version, install 3.5 parallel to it, and then manually tweak it to become the default. This introduces unneeded complexity and cruft. Synaptic should be robust enough to handle this sort of situation. Something like updating a single package shouldn’t be a showstopper, and we shouldn’t force users to juggle different packages.
If we are going to promote the package system as a superior way to deliver software then we must provide a way to deliver updated packages mid-cycle without forcing the user to do some CLI kung-fu.
Chris:
For completeness.. what version of Fedora were you referring to when you say several years ago? And at the time were you making use of out-of-tree kernel drivers not supplied by Fedora for your wireless and your video?
@Josh: I beg to differ. A regular user has no idea that Firefox 3.5 was released, nor will they care whether it is Firefox or Konqueror or Chrome, as long as they can browse youtube or log into their bank account. They will, though, feel let down if you force on them a fundamental upgrade to a newly released 3.5 (potentially with many regressions, as any new major release has), and some things stop working all of a sudden.
@Chris: another major issue is bandwidth. Having massive upgrades on the stable release means that people on dial-up (or limited bandwidth in general) will have a very miserable life with *buntu.
Just do what Linux Mint does. They have a really neat grading system for updates: level 1 is ultra-safe, and so on up to level 5, which is dangerous.
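As a rough illustration of how such a grading scheme works (this is a toy sketch, not Mint’s actual implementation — the level names and default threshold are my own invention), each update carries a risk level and the update manager only surfaces those at or below the user’s chosen threshold:

```python
# Toy sketch of a Mint-style update level gate.  Level names and the
# default threshold are illustrative, not Mint's real code.
UPDATE_LEVELS = {
    1: "certified",    # tested by packagers; always safe to apply
    2: "recommended",
    3: "safe",
    4: "unsafe",
    5: "dangerous",    # known to risk system stability
}

def visible_updates(updates, max_level=3):
    """Return only the updates at or below the user's risk threshold."""
    return [u for u in updates if u["level"] <= max_level]

updates = [
    {"name": "firefox", "level": 2},
    {"name": "linux-image", "level": 5},
    {"name": "gpodder", "level": 3},
]
print([u["name"] for u in visible_updates(updates)])  # → ['firefox', 'gpodder']
```

The point of the scheme is exactly the trade-off this thread debates: cautious users raise stability by lowering the threshold, while adventurous users opt in to riskier updates without the distributor having to pick one policy for everyone.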
Leo:
Someone could invest the resources in creating the equivalent of patch rpms for deb packages. Suse introduced patch rpms several years ago..and Fedora is now producing them as part of its build process for use with the presto feature. People are seeing major reductions in the amount of material that needs to be downloaded with presto, on the order of 80%.
Case in point:
http://dodoincfedora.wordpress.com/2009/08/05/how-awesome-is-this/
I haven’t done the full analysis yet…but I suspect that even just security updates for the kernel and firefox see a significant bandwidth savings with the patch rpm concept that presto uses. So even users who only care about security updates on Ubuntu could benefit from this sort of technology adapted for debian patches.
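To get a feel for where a figure like 80% comes from, here is a back-of-the-envelope sketch. Every number below is invented for illustration; none of it is measured presto data:

```python
# Back-of-the-envelope savings from delta (patch) packages.
# All sizes are made up for illustration, not measured presto data.
full_sizes = {"kernel": 30.0, "firefox": 12.0, "openoffice": 150.0}  # MB, full packages
delta_sizes = {"kernel": 6.5, "firefox": 1.8, "openoffice": 28.0}    # MB, deltas only

total_full = sum(full_sizes.values())      # 192.0 MB if downloaded whole
total_delta = sum(delta_sizes.values())    # 36.3 MB as deltas
savings = 100 * (1 - total_delta / total_full)
print(f"{savings:.0f}% less to download")  # prints "81% less to download"
```

The savings depend heavily on how much of each package actually changed between versions, which is why jef cautions below that a full analysis across a real update collection is still needed.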
Jef: I used Fedora in the days of FC 4-6. My wireless driver was bcm43xx, which I believe was in the Fedora kernel, at least by FC 6. I had nvidia video so I was probably using drivers from livna (not positive), which I guess aren’t technically Fedora’s responsibility, but livna was an essential resource for anyone who wanted a functional system (not sure if that’s still the case).
I know that Fedora made some significant changes to its release and support policies around the time Fedora 7 came out, so the experience I had with overly aggressive updates may no longer be relevant. But it was enough to push me to Ubuntu without looking back.
Chris:
yes.. out of tree drivers. I suspected that, but I wanted to be sure. I think it’s more than a little disingenuous to be upset that Fedora updates broke proprietary driver functionality you were relying on from outside the Fedora project. Fedora’s been pretty upfront from the beginning that proprietary drivers were a non-blocker for hardware issues.
And please, cut back on the hyperbole. I can assure you there are people out there who never needed out-of-tree kernel drivers in the era of FC4 through FC6 to have a fully functional system. If you felt they were essential for your hardware, fine; there can be no argument as to what you felt you needed for your own situation. But don’t ratchet this up to blatant propaganda by making the overly broad claim that no one could have a fully functional system without out-of-tree drivers. I am a counterexample to that assertion, and I am not alone.
There is a deep point here that I feel needs to be reiterated concerning how open development and proprietary drivers interact. Relying on proprietary drivers is a choice you make as a user and as a consumer. I will not argue that Ubuntu puts greater emphasis on supporting proprietary drivers where Fedora does not. It’s a choice that Ubuntu has made, and if Ubuntu’s policy with regard to proprietary driver support works best for you.. then you should be using Ubuntu.
But let’s be clear…supporting the users of proprietary drivers equally with the users of open drivers has real costs..costs that the users of open drivers pay and users of proprietary drivers do not.
Proprietary drivers create all sorts of difficulties in terms of balancing the needs of some users against the needs of others when choosing how to handle hardware related updates. Open source distributions cannot fix regressions associated with proprietary drivers, so when updates come down the pipe which fix important issues in open source drivers the distribution has a tougher choice to make if they must also consider the needs of users who have chosen to use proprietary drivers.
Regressions and bugs in proprietary drivers are inherently unfixable by the distribution developers and the upstream open source developers, whereas bugs and regressions in open drivers are fixable. Waiting on the developers of proprietary drivers to fix their own code distorts the open development process for the other users, who otherwise benefit from the faster rate of development that openness allows. What if the linux kernel itself was proprietary, and you and every other Ubuntu user had to wait for the keepers of that code to fix non-functional hardware issues because your chosen linux distributor did not have access to the code? Would you be satisfied with that? I’m not sure you would.
A zero regression policy coupled with implicit support of proprietary drivers…greatly limits how a distributor can respond to upstream open source advances. So as a result, the users who have made a conscious effort to buy hardware that uses open source drivers see issues fixed more slowly, to ensure that users like yourself, who make use of proprietary drivers, see no regressions at all.
I can’t tell you whether to think that is a fair trade-off or not. I suspect the perceived fairness of treating proprietary drivers equally with open drivers in a zero regression policy depends a lot on whether a person is using open drivers or proprietary ones. Want to make any bets as to whether Shuttleworth uses proprietary drivers on his day-to-day Ubuntu systems?
-jef
@Jef: excellent point, though I would believe this is beneficial for small patches (as in security patches where maybe you add bounds checking to some array access), not if you have a major version upgrade where the binaries will be almost _completely_ different. It would still be beneficial for the existing updates, though.
Last week that update to the new kernel killed my sound completely. Something like that should never have slipped through. I know it screwed a few other people too.
As for them fixing major regressions, well Ubuntu still haven’t fixed all the regressions that were introduced with Jaunty’s release (low volume on Acer laptops, non-functioning volume dial, bad video performance).
Soooooo, to me their policy on updates is flaky at best.
Leo:
I think once you take a closer look, you’ll see that a lot of the time, even across point releases over 3 months of project development..a lot of the package payload is still very much the same. I think when it’s all said and done the differentials are still quite small. But as I’ve said, I haven’t done the full analysis yet for the Fedora 11 update collection. I need to talk with the KDE maintainers, as they have pushed a major point release update through in Fedora.
Though this particular point is almost moot with regard to Ubuntu, as Ubuntu is so conservative that this particular issue would seldom come up. At best you could take the initiative to examine the backports for hardy, I guess, and try to gauge what this sort of tech would bring in terms of bandwidth savings.
-jef
slumbergod:
do you have a bug report url for reference for the kernel issue?
Jef: the nvidia driver that broke in Fedora was proprietary, but bcm43xx is not. So regressions in its performance can’t be blamed on proprietary development and the problems associated with it.
Also, to be clear, the reason I say you needed to use livna to have a functional Fedora system is that it contained not only proprietary drivers, but other non-free software that most modern computer users would consider essential, like media codecs. I’m sure there are some people out there who never play mp3s or watch youtube, and I agree that in an ideal world all codecs would be open, but until they are, places like livna are essential for most users.
Chris:
You need to decide what you want to talk about. If you want to talk about hardware breaking on updates..we can talk about that. if you want to talk about codecs..we can talk about that. Different issues entirely. It’s a little unfair to slide-in codec support as the “essential” piece of software in an ongoing conversation about hardware breakage during updates. Not cool…not cool at all. That’s the sort of thing that turns a discussion into an argument.
Back to hardware support. Yes some form of bcm43xx support was in the kernel so my point about proprietary drivers does not apply to your wireless issues.
But I will say this, if you look at the historical record, there is a prominent Red Hat/Fedora developer who was extremely active in pushing the wireless stack forward:
https://fedoraproject.org/wiki/User:Linville
In fact he’s the upstream kernel wireless LAN maintainer!
http://lkml.org/lkml/2006/1/10/458
That lkml thread is worth reading. Wireless was very much in a state of disorganization in that time period. And a Fedora developer stepped up and took responsibility for making it better…for everyone. He took over as the upstream maintainer in the FC-5 testing timeframe..right during the time when you would have had access to his expertise as a user. If you were having problems in Fedora and made the effort to communicate those problems, he would have been the person most likely to find the fix and get it fixed quickly.
The more unstructured update policy that Fedora has is both a blessing and a curse, as it leaves much of the decision-making to the discretion of the package maintainers. It introduces more chances for regressions, but it also provides more opportunity to get fixes to users faster. Fedora does take regressions seriously and has the manpower and expertise to respond to problems when they occur in the code that is shipped. Does Ubuntu? Remind me again: which Canonical developers are maintainers for upstream codebases that are critical for hardware support?
-jef
Jef @ 20: Chris turning a discussion into an argument? I respectfully beg to differ.
Chris has emerged as one of the Ubuntu industry’s most balanced, even-handed bloggers. And he also works hard to answer reader comments in a timely, respectable manner.
We welcome continued constructive criticism from you and WorksWithU’s readers. But I don’t see evidence of Chris turning conversations into arguments.
Joe Panettieri
Editorial Director
WorksWithU
Joe:
Feel free to disagree. But I’ve been married long enough to know with great precision exactly which rhetorical devices have the highest probability of taking a discussion over the line..especially when they are used unintentionally. My wife could most likely successfully argue that I’ve made a detailed statistical study of the effect.
If Chris wants to have a discussion about the patent landscape which keeps some “essential” open source software like codecs at arm’s length from distributions, we can do that..but it’s orthogonal to a discussion about how update policy impacts hardware functionality.
-jef
I’m still waiting for the repositories to separate programs and the OS. These are different things and should be treated differently.
Firefox should be upgraded to 3.5 by now, at 3.5.2, when most plugins work fine. OpenOffice.org’s 3.1 release should have been in the updates months ago.
If a program doesn’t work, hold it back. Duh. But these programs do work. They’re better. They’re faster. They’re stable. Why the heck are we intentionally holding them back when they will improve the Ubuntu system?