Embracing the "Meta Release Cycle"
One of the greatest problems hindering desktop Linux is its diversity. With 800+ distributions, dozens of user interfaces and lots of different applications that do the same things, it’s no secret that the Linux world is convoluted for developers and users. But if Mark Shuttleworth has his way, the free-software community might become a little saner.
In an interview with derstandard.at, Shuttleworth discussed ongoing efforts to coordinate Ubuntu releases more tightly with Debian’s development cycle, hoping that other distributions and upstream projects will follow suit. With the adoption of a “Meta Release Cycle,” he argued, the efforts of all free-software developers could be made more effective by allowing the latest versions of applications to travel downstream at the same speed.
Benevolent domineering?
Some may see this move as an attempt by Ubuntu, which has long faced hostility from geeks who resent its success or its focus on bringing ordinary people into the Linux fold, to domineer over the Linux world. To a certain extent, this may be true: if Ubuntu tells upstream projects to adjust their work cycles according to Ubuntu’s agenda, many developers are going to have little choice but to comply, if they want their applications to be kept in sync with the world’s most popular Linux distribution.
But a little benevolent domineering is exactly what the Linux community needs in order to move beyond the organizational mayhem that currently impedes its progress. If other projects follow Ubuntu’s lead and work together, we could finally enjoy a situation where all the latest stable software is released at the same time, instead of forcing distributions and users to sacrifice features for reliability, or vice-versa.
And while Ubuntu may seem to be using its influence to dictate terms to other projects, no one is under an obligation to comply. If developers choose to stick to their own schedules without taking the needs of other groups into account, that’s their choice. But it will likely mean being left out of the mainstream free-software ecosystem.
Of course, coordinating development is easier said than done. For one, I’m wary of Debian’s ability to meet a potential agreement to coordinate freeze dates with Ubuntu, given Debian’s tendency in the past to miss deadlines. Shuttleworth and Canonical should be careful not to let the meta release cycle impede Ubuntu’s impressive historical success in rigorously adhering to development roadmaps.
Despite the difficulty of enforcing coordinated release cycles, however, they are a worthy goal that will benefit all Linux users, whatever their distribution of choice. Let’s hope Shuttleworth’s newest vision becomes reality soon.
This “release cycle” is the thing that causes my computer to break every six months, right? And then the major bugs aren’t fixed until six months later, at which point my computer breaks in a bunch of other ways?
Mine doesn’t break every six months because I resist the temptation to join the release date download frenzy. I wait a couple of months and enjoy all the fixes that have settled out of the fray. I guess you could say it is a six month cycle that includes a hiccup, eh? 🙂
I, too, quickly tired of Ubuntu’s 6-monthly instability feature. I moved my systems back from Intrepid to the latest LTS (Hardy 8.04) and have enjoyed rock-solid stability ever since, and am missing precisely zero of the subsequent features. Compiz works, my graphic card works, networking works, security updates work, everything just works.
I’ve since vowed to only upgrade to LTS versions, and then only once the LTS version has been live for three months.
Canonical really should be pushing LTS as the default option for new users. Only experienced users should be using the non-LTS releases.
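For anyone who wants to enforce that LTS-only policy rather than just vow it, Ubuntu’s update-manager reads its upgrade behavior from /etc/update-manager/release-upgrades, where `Prompt` can be set to `normal`, `lts`, or `never`. A minimal sketch (it edits a scratch copy of the file so it is safe to run anywhere; on a real system you would edit the actual file as root):

```shell
# Sketch: restrict update-manager to LTS-to-LTS upgrade prompts.
# On a real Ubuntu system the file is /etc/update-manager/release-upgrades;
# this demo operates on a temporary stand-in copy.
cfg=$(mktemp)
printf '[DEFAULT]\nPrompt=normal\n' > "$cfg"   # stand-in for the real config
sed -i 's/^Prompt=.*/Prompt=lts/' "$cfg"       # switch to LTS-only prompts
grep '^Prompt=' "$cfg"
```

With `Prompt=lts`, the upgrade tool only offers a new release when the next LTS appears, which matches the wait-for-LTS habit described above.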
What’s hindering Desktop Linux is not diversity, but a lack of unity and coordination. It’s OK if there are 50 ways to do something, as the motivation of many developers to write code is to make software to scratch their own itch. What Desktop Linux needs is for someone to stand up and get some of the best pieces of software to work together optimally, so that a unified operating system can happen. This is where the need for good leadership comes in, and Mark Shuttleworth is doing a good job getting the bits to work together better.
As for the meta release cycle, Ubuntu isn’t doing any arm twisting. You imply that Ubuntu is demanding that other distributions conform to its release plans, but Mark has said that he is willing to delay the next Ubuntu LTS in order to find consensus on a meta-cycle with other distributions. To quote Mark: “The LTS will be either 10.04 or 10.10 – based on the conversation that is going on right now between Debian and Ubuntu. So we’ve given up some control when our LTS will be,…” Source: http://derstandard.at/fs/1246541995003/Interview-Shuttleworth-about-GNOME-30—Whats-good-whats-missing-what-needs-work
Also, Debian can delay all it wants for stability and still keep to the agreed-upon schedule, because Ubuntu and Debian are agreeing on a freeze date, not a release date. Debian has never had a problem picking which version of the Linux kernel to ship, but takes its sweet time polishing it. A freeze date is just the day when the final versions of important applications are set in stone, so that different distributions can work together on patching and testing the software that will be included in their next release. This also helps upstream developers coordinate with distributions to decide which version of their software will be shipped. If upstreams know when this meta-freeze will be, they can make sure that the best version of their software is ready for it. This will go a long way toward creating a unified desktop software stack.
Over the years we have seen a pattern emerge regarding release schedules of distributions. I agree with Christopher that changing the schedule does little to change the current state of affairs. Some people want the latest and greatest features while some don’t. I suppose that is why we have the LTS and normal releases.
Meta distributions should, and do, have longer release tracks, as with Debian and Slackware, with a few exceptions like Fedora. Perhaps having upstream projects that are used in meta distributions follow a rational two- or more-tiered approach, with a long-term release that is feature-stable and another that introduces new features, will suffice. Most of the problems we have all seen come from moves like KDE stopping development of the perfectly good, stable KDE 3 and shifting gears to release KDE 4 before all of the KDE applications previously available in 3 were available and working (the reason I now use the not-so-user-friendly Slackware 12 on a regular basis).
We can continue to beat our chests protesting the overarching problem of fragmented effort in the development of free software, but the problem will not go away. We should instead help direct the effort so that contributions are synergistic. Rather than reinventing the wheel, we should consolidate the state of the art in Linux upstream, from derivative works to meta distributions to individual projects, which is in the spirit of the GPL and F/OSS anyway.
Personally, I appreciate that new features are rolled into 6 month chunks, otherwise to get new features I would be using a testing release as I did in Debian, or compile my own applications as I did in Slackware and Mandriva (in its Mandrake days). My opinion is that those that have fear of upgrades or that don’t have the time to debug problems and reconfigure their systems stick to long-term releases.
Hello,
I think Debian is a big hindrance to Ubuntu.
Here is one example why:
https://bugs.launchpad.net/ubuntu/+source/imagemagick/+bug/348862
Debian develops slowly and its software is often too old and useless.
Let’s not forget that Fedora follows Ubuntu a few weeks later. It’s not like everyone would be bending to accommodate Ubuntu alone; they also get the benefit of fitting in with Fedora’s release cycle as well.
In general there are two approaches being used in distribution releases. There is the Debian/Slackware approach of “We will release it when it is ready.” and the *buntu approach of “We will release it every six months come hell or high water.” Both approaches have their good points and their bad points. Software tends to be dated with the first approach. But the second tends to introduce instability (hell) in the system and a rush to get things done before the deadline for a release (high water). Additionally, one cannot install once and use the supplied system update tools to keep the system up-to-date forever. Oh, they claim you can. But I find this approach does not work in practice. And there are built-in cut off dates beyond which no more updates will be supplied.
Because of these things (and because KDE 4 currently lacks features I use in KDE 3.5), I use Kubuntu 8.04, and will not “upgrade” until KDE 4.3 is available as a standard part of an install. But I will not be upgrading to Kubuntu version whatever dot whatever. I’m tired of having to reinstall my system just because the version I started with will be abandoned by its maintainers after such-and-such date.
There are some distributions that have combined the two approaches to updates by releasing software when it is ready and making all updates available to all users, regardless of what version of the distribution they originally started with, allowing per package updates.
That’s what I want. One install and updates (not just bug fixes) as they become available without my “version” being orphaned after an arbitrary date.
The whole idea of unifying everyone’s development release cycles to accommodate the master distro is ridiculous and unnecessary. PCLinuxOS has it figured out. Why can’t Canonical?
People aren’t hostile to Canonical because of their “success” or interest in “normal” users (who is normal? what does that mean?). They’re hostile because Canonical tends to stomp on the toes of everybody around them just because they can afford to.
Even the commercial vendors like Redhat and Novell try to play nice with the wider community, because it’s good for business. Canonical has a “can’t fail” mindset because they’re being bankrolled by a rich kid, and it’s made them rather snobbish. It’s subtle, but they all act to some extent like “we’re rich so we are important, you have to do things our way or else we’ll just ignore you”. Usually while talking a lot about their ‘needs’ and ‘community focus’ and how they are much more ‘community focused’ than you are. It tends to annoy people.
(Their “success” is mostly a successful marketing drive that makes a fantastic loss each year – it’s hard to fail at marketing when you don’t have to break even. Geeks in general don’t resent that, but neither are they impressed by it)
@Andrew I am puzzled at the toe-stepping comment considering that every distribution has projects that they originate and projects that they contribute to. Because of the structure of open source development, there will always be a tug of war with respect to the direction of development. For instance projects that are pivotal to the implementation of the desktop like the kernel, Xorg, network manager, dbus and so on, will always have a great deal of noise because everyone has a vision that is independent of everyone else. The larger contributors do not necessarily win out and the best ideas are not always accepted, which is why projects fork. There is nothing wrong with forks given that they present an opportunity to refine features that get merged back into the mainline.
Sometimes these guys deserve a break, considering Red Hat and Novell are publicly listed and Canonical is a private commercial company. It costs money to make software, even for the little guy tinkering in the basement. One thing they have in common is that they are trying to make the very best products, even if their strategies do not always agree with public opinion. However, they cannot operate on the same schedule or commit the same kind of resources, so different patterns of development and release emerge. When you merge commercial interest with benevolence, you arrive exactly where we are today: there is only one result, which is a generally agreed-upon release cycle.
I have a simple rule to upgrading. Use what you need. If your current OS meets your needs why bother rushing to upgrade to the next version?
Upgrade according to your own agenda. FOSS is supposed to free people to use technology as they see fit.
I echo the comments above in agreement with them, and I also support Mark S’s idea of a Meta Release Cycle.
I do also agree that waiting too long leaves you with old software, yet having the latest and greatest causes issues with stability as well. This issue has really already been addressed: the LTS versions are the stable builds, and the other releases are not. That is why, like another poster said, I wait three months after an LTS release to install it. I do think that Canonical and Ubuntu have a great idea in trying to set some sort of standardization for freeze and release dates, just as long as they don’t become another tyrannical company like Microshaft.