Ubuntu 9.10 Boot Performance, and Does it Matter?
Improving boot time has been a focus of Ubuntu developers in recent releases, with the goal of a ten-second startup set for Ubuntu 10.04. To test progress thus far, I compared boot performance for Ubuntu 8.04, 9.04 and 9.10. Below are the results, which demonstrate the impressive strides that have been made towards a faster boot.
Tests were completed using the i386 builds of Ubuntu 8.04.2, 9.04 and the second alpha of 9.10, which was released yesterday. Each operating system was installed with the default options to a virtual disk image, and was run in KVM, using identical virtualized hardware for each test. Boot time was measured from the point at which GRUB loaded the kernel to the appearance of the GDM login screen.
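One way to capture that end point is to read the kernel's uptime counter from a hook that GDM runs when the greeter comes up. This is only an illustration of the idea, not necessarily how these tests were timed; the Init/Default hook location is an assumption:

```shell
# Drop this line into a script GDM runs at greeter start-up
# (e.g. /etc/gdm/Init/Default on releases of this era) to log
# seconds-since-kernel-start when the login screen appears.
awk '{ printf "booted to greeter in %.2f s\n", $1 }' /proc/uptime
```

The first field of /proc/uptime is seconds since the kernel started, so this approximates "kernel handoff to GDM" without any extra instrumentation.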
As a disclaimer, I will mention that the boot times recorded below may be a bit faster than those that can be expected on normal desktops and laptops, since my KVM testing environment runs on server-class hardware, with multiple processor cores and high-speed disks powering the virtual machines. Nonetheless, the tests provide a standardized basis for comparing boot performance between Ubuntu releases.
The numbers
The averages of several tests gave the following results for boot time, in seconds, between the three Ubuntu releases:
- 9.10 (Karmic) alpha: 12.94
- 9.04 (Jaunty): 17.06
- 8.04.2 (Hardy): 25.07
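The relative speed-ups implied by those averages can be checked with a quick bit of shell arithmetic (a throwaway sketch; awk is used here only as a calculator):

```shell
# Average boot times from the tests above, in seconds
hardy=25.07
jaunty=17.06
karmic=12.94

# Speed-up factors relative to the Karmic alpha
awk -v a="$hardy"  -v b="$karmic" 'BEGIN { printf "Hardy  -> Karmic: %.2fx\n", a / b }'
awk -v a="$jaunty" -v b="$karmic" 'BEGIN { printf "Jaunty -> Karmic: %.2fx\n", a / b }'
```

This works out to roughly 1.94x over Hardy and 1.32x over Jaunty.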
The numbers speak for themselves. Ubuntu’s boot time has improved enormously since the release of 8.04 in April 2008, with the Karmic alpha build starting up almost twice as fast as Hardy.
This progress can be attributed in part to the use of the faster ext4 file system as the default in Karmic. But since Jaunty, which uses the older ext3, also demonstrates significant improvement over Hardy, it’s clear that there’s more to the advancement than file systems alone. Ubuntu developers’ focus on optimizing the boot process, and the replacement of the ancient System V init daemon with upstart, have also been major contributors to the performance increases.
Does it matter?
My initial reaction to discussion about Ubuntu boot time is to wonder who really cares these days how fast a computer can start up. Personally, I shut down and reboot my computers very rarely, preferring instead to suspend or hibernate them when idle. I get the sense that most other computer users do the same, if they power down their machines at all when they walk away (which they should, because electricity doesn’t grow on trees).
That said, I’m sure there are some users (like those whose machines still won’t suspend properly) who have to reboot their computers often, and who benefit considerably from faster boot time. In addition, having the fastest boot among modern operating systems provides Ubuntu with some bragging rights, and demonstrates that its developers take usability seriously.
Shaving boot times down to 10 seconds certainly does create a lot of buzz for Ubuntu, but I think their efforts could be better spent in other areas of development. Booting super fast into a desktop that is in essence no different than before doesn’t make a whole lot of sense to me.
I would argue that my laptop would benefit greatly from the faster boot time. Frequently I hop on the net to check email and news, but get interrupted to take the kids somewhere or make breakfast/lunch.
Usually I power down to save electricity (I’m obsessed with lowering power bills, etc.). If I can start up faster, that means I can get back to my news, email, and whatever.
Quite probably, the fast boot could also benefit servers. Imagine a popular site, especially a commerce one, whose servers had to be rebooted in some situation. Every second down could cost money.
This is also relevant to servers being booted to scale for extra traffic or business.
Yes, these are somewhat extreme points of view, but still relevant. Also, I would LOVE to have my machine boot that much faster than my friends’ WinPCs. 😉
You should not believe every other computer user has the same behavior as you. I actually know very few people who suspend or hibernate. I really look forward to a 10-second desktop.
Not shutting down is an artifact of MS garbage programming (long shutdown and startup times), which in turn causes lots of bad things: clogged fans and fins, worn-out disk drives, disk drive head crashes from power surges, accelerated aging of electronic parts from continual heat, etc., not to mention excessive waste of power. I always shut down my Linux and Mac machines; my wife NEVER shuts down her Windows machine. I fix her machine about once a year; mine starts up any minute I want it (many times a day). GO UBUNTU!
Boot time wouldn’t matter at all if they spent their time fixing suspend and hibernate. Whatever happened to Linux’s fabled uptime? Where are Canonical’s priorities?! Two releases in a row that they focus on boot time?
I guess they realize that every release is going to have major problems that people won’t know how to fix any other way than rebooting. That’s what I do whenever my Wi-Fi randomly stops working since the “upgrade” to Jaunty. We’re on a Windows 95 quality level here.
Want some ideas of things to work on?
http://brainstorm.ubuntu.com/most_popular_ever/
Faster boot times are great, but I think enough effort has been placed by Canonical on this issue. The times you note for 9.10 are already good enough. Major effort really needs to be placed on “smoothing out” the Ubuntu experience for the ‘average’ user. There is no reason the netbook market should be offering WinXP as the default OS; yet I still had to edit the GRUB menu.lst just to get some hardware [right-side card reader] to work on my Acer Aspire One. Not acceptable for the average end user. The point: big push for “Ubuntu for the masses”. Break the M$ default stranglehold. Force distributors of netbooks/low-end PCs to come up with really good reasons NOT to use Ubuntu as a default option (vs. WinXP/Vista/7). “Follow your dreams; you can reach your goals.”
Either boot in 5 seconds, together with really good session restoring after login, or hibernation that works on all machines.
Of the two, I prefer fast boot. Take into account all the guys who run dual-boot systems and cross-use partitions.
Suspend is no option because, as you mentioned, electricity does not grow on trees.
As :m) says, suspend/hibernate is no good to you if you’re switching between distributions and/or between Linux and Windows (to play games, in my case).
And anyway, suspend/hibernate hasn’t worked for me since Feisty!
Chris,
I think you need to take a closer look at how Ubuntu is using upstart. My understanding is that it’s still using SysV compatibility mode in all available Ubuntu releases dating back to at least Hardy. If so, upstart is just running the traditional init runlevel scripts exactly as init does, and services like ssh are still shipping initscripts and not upstart event scripts.
For example, /etc/event.d/rc5 is the upstart job that mimics init runlevel 5 start-up. ssh, even as shipped in Karmic, has an initscript in /etc/init.d/ and not an upstart events file that takes advantage of any upstart-specific features.
I don’t think you can attribute the speed-up to upstart replacing init at all. But feel free to show me I am wrong. Tell me exactly which default Ubuntu services in any release, even the Karmic alpha, take advantage of upstart’s event-based design features and are not being run in init compatibility mode. There is no boot speed benefit to anything run via upstart’s init compatibility mode.
-jef
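To illustrate the distinction jef draws: a service in compatibility mode is just a classic imperative initscript in /etc/init.d/ that upstart executes via its rc jobs, while a native upstart job is a small declarative file that upstart can start in parallel with other jobs. Very roughly, with a hypothetical service name and upstart 0.3-era stanzas recalled from memory (treat this as a sketch, not a reference):

```
# /etc/init.d/mydaemon -- SysV style: a shell script upstart merely runs
# via its compatibility jobs (rc2, rc5, ...), one script after another:
#   start) start-stop-daemon --start --exec /usr/sbin/mydaemon ;;

# /etc/event.d/mydaemon -- native upstart job: declarative and
# event-driven, so upstart itself supervises and parallelizes it
start on runlevel 2
stop on runlevel 0
exec /usr/sbin/mydaemon
```

Only the second form gets any benefit from upstart's event-based design; the first boots exactly as it did under System V init.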
Does it matter?
For people with netbooks? Yeah, sure… why not?
But I’m still waiting for that “visual refresh” they promised long ago. A fresh install looks like diapers gone wrong.
I really love the faster startup time, but I would rather see some work being done on what’s going on after the login screen; that’s where I actually sit and wait for the computer to start.
Also, the efficiency of the applications people keep running: my Firefox, currently with 5 tabs open, uses 10-20% CPU. What’s up with that? It’s also too slow to start.
And by the way, Christiaan Huygens’ description of the Ubuntu look is one of the best I’ve read.
A 10-second boot is great for many reasons. I won’t list them all. I just want to remind you that a great percentage of Ubuntu users have dual-boot machines with Windows/Ubuntu (like me). I boot my PC many times a day, because you can play the latest games or use specific software only in Windows. But when you want security, in order to e.g. surf the web or check your bank accounts, you need to boot Ubuntu. In my case I need to restart my machine at least 10 times per day, so a 10-second boot is great.
Go UBUNTU!
Boot times definitely matter. No question about it. As the world goes mobile, people don’t want to wait around for laptops and netbooks to boot up.
Side note: Chris, great analysis of Ubuntu boot times from version to version. The web is buzzing about Ubuntu 10.04 boot time goals, but you’re the only person who actually took a look at boot times from one build to the next. Simple but smart. Thank you for doing this for WorksWithU’s readers.
-jp
I look forward to the boot time improvement, though I agree that there seem to be better things to focus on right now. But a visual refresh is not one of those things. Brown is fine with me, and it can easily be changed if you don’t like it.
If they achieve a 10-second boot time with a regular HD, imagine what the boot time will be with an SSD.
It’s all about smartphones, netbooks, small devices you turn on and off often, to save battery, prevent heating, etc…
It’s not for people like me who restart their machine once in a blue moon.
Great point by gdp77 about dual-boot PCs, where constant reboots happen throughout the day. If other Linux distros are also speeding up, then it means we can easily have different distros for different purposes. For example, openSUSE for Mono development, Windows for games, and Ubuntu for internet and office apps.
I agree that faster boot is very welcome. I can’t confirm your experience with the Karmic alpha, however. I’ve been running it in a VirtualBox VM and it is much, much slower than Jaunty. I’m sure that will change with later releases. Interestingly, this is the first Ubuntu alpha that has taken the VirtualBox Guest Additions without hacking.
Apart from boot time, I think parallel work should be done with gdm and gnome (and xfce). It is getting so that booting the OS is quicker than logging into the graphical desktop! I usually use e17 which currently puts the others to shame in terms of the login to desktop delay.
It’s all about scalability. For modern desktops, which might be “on” all the time, this is not a huge issue. But at the other ends it matters a lot:
1. Old hardware: 40 seconds is a lot better than 5 minutes
2. Netbooks and notebooks
3. Servers: you almost never reboot them, but when you do, you want to be up in no time
4. Embedded
Certainly, 2 and 4 are markets where Ubuntu can make money, and 1 and 3 help keep a large mindshare. I think it’s a good call.
I meant to say “Markets where Canonical can make money” (not “Ubuntu”). Also, fast boot times are a Godsend when/if suspend/resume are not working properly.
Look, nobody is going to complain that Ubuntu boots too fast. That would be absurd. My concern is – is this all we can expect from future releases, faster boot times? Even if Ubuntu boots instantly, that won’t really bring new users into the camp. Not by a long shot. And that’s the issue: if Ubuntu is to keep the excitement alive, it needs to address much much more than just boot speed.
@Walt: As for laptops – and I’m typing on one now – during the day I just suspend to RAM. The power it draws in suspend mode is so minimal. Honestly, with all the HDD activity for a cold start, it probably saves power.
@ooboo: was that directed at me? I am sure this effort takes some of the resources at canonical and the larger Ubuntu community. But I don’t think more than 10% of the human power for the next release will go into this. I think it’s just a very visible project. How many devs do you need to profile/test a faster boot?
BTW: thanks a lot, Christopher, for running the numbers! And thanks for the caveat about the hardware.
Incidentally, I always wondered: when they say “10 seconds”, do they specify some reference hardware? Not only processor speed but disk speed is also a huge factor; some of these shiny SSDs out there can even make Vista boot in 10 seconds (of course I am kidding, but you get the point 😉 )
My vote goes towards *stability* and *performance*. Jaunty was a &%&/%&%$ dog (!) for many of us with laptops using Intel chipsets. I personally don’t care if they shave another few seconds off — it’s plenty fast enough already!
What really matters, especially for adoption by businesses and Windows users wanting to try something else, is how well it runs. That should always be priority number one.
Well, yes, the screw-up with the Intel drivers has been memorable. ATI drivers were really problematic this time, but only the binary drivers, so at least the official ones were OK. But this is more a matter of decision making, not resource allocation. Ubuntu should not embrace a radical change if it blows up in people’s faces after the upgrade, IMHO.
I personally love these huge improvements in boot time and general speed/snappiness of the system, but I will have to agree on the people stating that more effort should now be put in different areas of the system. There are, after all, plenty of areas where things can be improved, whether it comes to user experience, hardware compatibility or things simply working as they should.
Go cut the boot times a bit more if you like; I myself would appreciate it a lot. But if you have more pressing issues next to boot time, please make sure not to neglect these issues!
I forgot to mention that personally I’d like to see Ubuntu avoid major screw-ups like the Intel driver case (just keep using the old, tested and proven driver/settings until you’re sure the new one really works!), as well as improved support for hardware suspend; that’s almost never worked for me.
@DaVince:
It may not be so easy sometimes to hold back and keep using old versions of video drivers. X and the kernel are becoming more tightly linked. Moving to a newer kernel to gain improvements in some areas may mean taking new video drivers as well, and also a new X subsystem. You can actually create more instability by trying to hold some pieces back while moving others forward, because that particular combination of old and new code isn’t something that upstream developers are actively testing. Is Ubuntu’s conservative approach to exposing users to new tech actually part of the problem here?
http://cworth.org/intel/driver_stability/
-jef
@Jef: Thanks for pointing that out. I think you are right, in this case the issue couldn’t have been avoided by the Ubuntu folks. It seems as if the Intel devs are doing excellent things for Linux opensource graphics (which I truly appreciate), and they just released them a little too early!
@Leo: With all due respect, please don’t sit there and tell me that the issues couldn’t have been avoided by the Ubuntu folks. They *choose* what to include in the final build. New kernel versions are about as exciting to me as watching grass grow. Breaking my video compatibility (when up ’till then it was working just fine), on the other hand, really does grab my attention – in a bad way.
This is by no means a first offence. Ubuntu has a long-standing tradition of releasing too close to the edge. In Gutsy, Compiz on Intel video sort of worked, but there were issues with video playback. In Hardy, can you say “PulseAudio”? Now we have these Intel video issues in Jaunty. (Now I’m just waiting to see what happens with ext4 as the default in Karmic.)
Is it honestly worth smearing Ubuntu’s reputation, is it worth all the bad press, bad forum comments, bad blog posts, and countless bad experiences that many first-time users will have of Jaunty (and who may now never consider Ubuntu again), just to have the latest kernel and X versions? Sometimes I think Ubuntu is its own worst enemy. It doesn’t need Microsoft to take shots at it. It can do that all by itself, thank you very much.
If they could pull their heads out of their compilers for a minute, maybe they’d get smart and build a team of individuals who could try to head off these fiascos before they happen: a sort of ‘internal fiasco police’. Either way, we need someone at the helm who knows when to hit the ‘stop’ button and pull back to code that is more ‘tried and tested’, rather than playing it so ‘fast and loose’ at the users’ expense.
ooboo:
The only team of people who have the perspective and knowledge to know what the problematic codebase interactions will be are the people who are actively involved in the upstream project development of those codebases.
Let me ask you a question: how involved are core Ubuntu developers in the upstream project development of the components you are talking about specifically, namely the kernel, Xorg, and PulseAudio? How many core Ubuntu devs are active and significant contributors to the development of the upstream codebases that really matter to you? Video drivers matter to you; who in the Ubuntu core development community is the superstar upstream contributor working on video drivers? Does Ubuntu have people who are making significant contributions to these low-level components and who understand the feature roadmapping that is going on in terms of hardware support?
The open source development model is not a neat and tidy linear progression. There are a lot of moving pieces and interactions. None of the upstream projects promise a regression free experience as they work to enhance individual codebases. None of the upstream projects are beholden to you or to any other user of Ubuntu or any other linux distribution. And Canonical surely doesn’t have the manpower on their own to enforce a zero regression experience on your behalf while still staying relevant for newer hardware and capabilities. The best chance at regression prevention is for Canonical devs to be as proactive as possible in upstream development instead of waiting for other people to find and solve potential problems for Ubuntu. Canonical can’t lead as an integrator if all they do is wait for all the integration problems to be fixed by others.
With all due respect, Ubuntu decides how far they wish to swim downstream. Sure, there are benefits and new features to be had from using the latest releases, but it comes with the risk of undiscovered and unresolved bugs. It does no good to be ‘feature-crazy’.
And so what if there are a lot of moving parts over which Ubuntu has little or no control? They still have the power to veto changes to the system. They still have the power to decide how far and how fast to reach into uncharted territory. And they can do so with reckless abandon (as it seems they have been), or they can be more conservative, focus on testing, and keep their users happy.
See, it’s not about delivering technology, it’s about delivering benefits. Software that is new has the promise of benefits, but it does not always deliver on this promise.
Using the argument that it’s a problem they inherit from upstream seems pointless to me. It’s like saying you want to build a car, and you decide to use the latest Goodyear tires, which have a 50% failure rate, even though you could have used last year’s Goodyear tires instead, which have a 0.01% failure rate. It just makes no sense to me to release that close to the bleeding edge when all it does is damage the Ubuntu brand and raise all these fiascos.
Show me how many people actually benefited from Ubuntu’s decision to use the latest kernel version and latest X. OK, so two people got to use their Wacom tablets on Jaunty. Woo hoo! Now show me how many first-time users will never consider using Ubuntu again (and will tell their friends as much) because of problems with Jaunty on Intel graphics. Do you see my point?
Can’t they just make a development branch, and not release it until it has been adequately tested? What’s so wrong with that?
ooboo:
Your car tire analogy falls flat for several reasons.
First: to make a determination about failure rate you must have the manpower to do the failure testing in real-world scenarios. Few upstream open source projects have the manpower to do that comprehensive testing. This sort of empirically gained knowledge of real-world failures is by and large gathered at the distribution level.
Second: Goodyear has contractual obligations to fulfill when providing tires to car manufacturers. Goodyear is undoubtedly exposed to liability with regard to manufacturing defects that it cannot avoid as part of that contractual relationship. That contractual liability is a strong incentive to sell products which work. Upstream open source projects have no such liability concern, as they do not “sell” consumable products to the distributors.
But on to more specific points…
“Show me how many people actually benefited from Ubuntu’s decision to use the latest kernel version and latest X.”
Have you taken any time to actually look at the bugfix churn that happens between each release of just the upstream kernel? How exactly do you quantify the impact of choosing not to incorporate all those fixes? These are real bugs, with real fixes. How do you weigh those fixes against potential regressions that you don’t know about yet? Trying to cherry-pick individual fixes out of that rate of churn requires a level of manpower that Canonical simply does not have. That’s exactly the sort of mind-numbing backporting work Red Hat has built a business around charging customers for in RHEL.
There is absolutely no way Canonical is going to be able to make that kind of manpower commitment, every six months, on behalf of several million Ubuntu users who are not paying Canonical a dime for support. It’s a completely unrealistic expectation. Stop and think for a second about who is actually paying Canonical. It’s the OEMs who are paying: paying for engineering services to get new hardware supported for pre-installed Ubuntu deployments. Users are not paying Canonical for regression-proofing of self-installed Ubuntu; that is not Canonical’s business. Stop expecting a zero-regression experience, and start a user experience process which plans on having to mitigate regressions as they appear.
Even so, hiding from the rate of upstream churn by being overly conservative only works in the immediate short term. In the long term, on multiple-release timescales, all it does is put you further and further behind on the number of integration issues that will have to be dealt with. The only way to minimize disruption at the distribution level is to be as engaged as possible with upstream, to drive real-world integration issues into the active development work. Someone inside Ubuntu needs to step up and be active in upstream development in all the critically important projects.
“Now show me how many first time users will never consider using Ubuntu again (and will tell their friends as much) because of problems with Jaunty on Intel graphics. Do you see my point?”
This assumes that I share the belief that aggressively pushing Ubuntu adoption on others is a good thing. I do not share that belief, so it would be disingenuous for me to answer the question in the spirit it was asked. But I will say that whatever mistakes Ubuntu devs made in judgement with regard to the Intel issue can only be prevented in similar future situations if Ubuntu grows manpower who are actively involved in upstream development, not just consuming that development. You can only prevent regressions by getting ahead of them as part of upstream development work. Sitting on the sidelines, waiting for upstream to deal with the problems your userbase may have, is not going to see those problems solved.
“Can’t they just make a development branch, and not release it until it has been adequately tested? What’s so wrong with that?”
Can they? I don’t know. It really comes down to priorities, available manpower, and Canonical’s business model. It looks to me like the PPA concept is blossoming into a very diverse set of overlapping development branches for Ubuntu. But even if the tangled PPA-scape ends up being usable for more in-depth regression testing, enough people have to actually use it and report back, or regressions still won’t be found.
-jef
And people ask me why I bailed out on Ubuntu and went to Arch: it’s because Canonical focuses on meaningless crap like boot time and shiny notification daemons. 9.04 may have been decent, but 8.10 was NOT a good release, because Canonical wasted time on features and crap like this instead of on bugs.
This is why Ubuntu may be popular with n00bs, but will never really win over the seasoned Linux users.
Oh, and I think a fixed six-month release cycle is a mistake. I think they should do a rolling release (à la Arch) or a “next release will come when it’s ready” model, à la Debian.
@Jef: First of all, let me say I’m enjoying the lively discussion. 🙂
Despite the points where we seem to clash, I am seeing there are some key areas where we appear to be in agreement.
I started this thread by basically saying that Ubuntu needs to head off these fiascos before they happen, ideally by catching them before they are included in the official releases.
Now, you are saying Ubuntu would benefit from getting more of its people involved in upstream development, particularly in key areas of the system which are more prone to creating critical problems.
> Someone inside Ubuntu needs to step up and be active in upstream development in all the critically important projects … whatever mistakes Ubuntu devs made in judgement with regard to the Intel issue can only be prevented in the future for other similar situations if Ubuntu grows manpower who are actively involved in upstream development, and not just consuming that development.
Look, you’ve made a good argument for greater Ubuntu participation in upstream development, thereby taking a more proactive stance against bugs and compatibility issues. I don’t disagree. Aren’t we just splitting hairs here? We both seem to be saying that more quality assurance needs to happen. You’re suggesting a specific way of implementing that, whereas I’m taking a more general approach in saying we need to produce a less buggy end product; I don’t care how, just get it done.
Personally, I don’t care which part of the ‘stream’ gets the increase in attention, just so long as it gets done before that code is put ‘into the wild’ in an official release. Otherwise we are making end-users into unsuspecting beta testers, and I don’t think that’s right. I also don’t think you should make a product, and give everyone the impression that it’s ‘production-ready’, when you very well know it isn’t – or more accurately, when you have so very little proof that it is.
What is also at hand is the difference between ‘image’ and ‘reality’. The image that Ubuntu is trying to present is of a polished, stable, and usable alternative OS. The reality is that Ubuntu (apparently) exercises very little control over the quality of what it releases, and instead relies too heavily on the end users of its official releases to uncover and push through the sort of fixes that should have occurred well before the official release took place. I don’t know how long Ubuntu thinks it can keep up this deception. Users are getting wise to them. Personally, I’ve already taken a “wait and see” approach to all new Ubuntu releases. I’ll never install a new release on one of my computers until it’s had at least a month of testing and feedback from the user base. Maybe Ubuntu should call its current .04 and .10 releases ‘development releases’, and push back the official public release to .05 and .11.
The “reference machine” that Canonical uses for boot time testing is a Dell Mini 9 (with an Atom CPU & SSD disk).
And it’s true that until now the speed-ups were mostly not from using upstart (though that might change in Karmic).
See: https://wiki.ubuntu.com/FoundationsTeam/BootPerformance
I am looking forward to faster boot times… you are on exactly the right track. When Windows first came out its boot time was good, and I figured with each new release it would get faster. WRONG, it got slower, and that’s before it even moves on to loading the operating system. BTW, I think a lot of people do not know how to set their CMOS settings or pick a board, because about half of that boot-up time is due to the CMOS/BIOS stage. I am booting an nvidia board in 13 seconds on 9.04, using an 80 GB Western Digital SATA 1 drive and an AMD 3800+ with 1 GB RAM. I will make a video later showing my system, board and all, with me going from power button to using Firefox. Windows 7 boots in a little over 35 seconds.
Fast boot times matter but not as much as the overall speed, responsiveness and stability of the system. Light distros such as Puppy Linux and Antix MEPIS still use antiquated kernels but they perform more snappily than Ubuntu.
As a college student, I can say that fast boot and shutdown speeds are major concerns when using netbooks (I don’t have one, but see them often). If your next class is in 10 minutes and you have to go up 5 floors or to another building, waiting for your netbook to shut down so you can safely store it, or waiting for it to boot after you arrive late at your next class, are not options. I hope Ubuntu can corner the netbook market: it’s cheap, effective, lightweight, and very stable, perfect for the light processing power of a netbook. Also, most college students have some level of computer knowledge, and using Ubuntu, I feel, is a good way to exercise your inner geek. Linux lets it all hang out; you understand how your computer works more when running Linux than when running Windows, where things like the registry, the system32 folder, and msconfig can scare people pretty easily because they are not understood and can be quite dangerous to mess with. Boot performance can also lead to vast improvements in handheld PCs, smartphones, mp3 players, and all kinds of embedded devices. Speed is good, maybe great, but it should not steal too much of the spotlight.
Boot times do matter. Most desktop and laptop users (or even server users, like yourself) may not care about it, but it matters because ubuntu is trying (hard) to take over the netbook market. It matters a lot on those low power machines. I have an older laptop and Ubuntu 8.10 actually takes about 70 seconds to boot up. So it matters for people with older hardware too. The other reason it matters is that with time, software bloat increases, and someone has to do something about it. If the developers didn’t do anything, the newest version of ubuntu might actually take 70 seconds to boot on your server class hardware, and I’m sure you wouldn’t want to use it then because it would be too “slow”.
Question: if boot-up were as quick as sleep/hibernate, or even somewhere close, and would save you electricity (promising longer battery life), would you still suspend your computer? In the vast majority of cases, I wouldn’t: the only time I need sleep is when I need to continue what I’m currently doing elsewhere. When I’m done with my task, I shut down. I suspect many other people would do the same if booting up kept getting faster (5 seconds, anyone?).
Ok, just for fun..
This weekend I tested Moblin 2.1 on my Aspire One 110.
Wow, great boot time.
Just to experiment, I extracted the Moblin kernel and kernel modules (using a double loop mount on the live image).
Then I put those into my Ubuntu 9.10 install.
Boom… pretty cool, now I have a super-fast-booting Ubuntu 9.10.
The trick is that no initrd is used by the Moblin 2.1 kernel.
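The extraction the commenter describes might look roughly like this. This is an untested sketch reconstructed from the comment, not a recipe: the image filename and the path of the filesystem inside it are assumptions, and it all needs root.

```shell
# First loop mount: the Moblin live image itself (hypothetical filename)
sudo mkdir -p /mnt/moblin-img /mnt/moblin-root
sudo mount -o loop moblin-2.1-netbook.img /mnt/moblin-img

# Second loop mount: the compressed root filesystem inside the image
# (the exact internal path is an assumption about the image layout)
sudo mount -o loop /mnt/moblin-img/LiveOS/squashfs.img /mnt/moblin-root

# Copy the Moblin kernel and its modules into the Ubuntu install
sudo cp /mnt/moblin-root/boot/vmlinuz-* /boot/
sudo cp -a /mnt/moblin-root/lib/modules/* /lib/modules/
```

A matching bootloader entry (menu.lst at the time) then has to be added by hand, with no initrd line: per the commenter, the Moblin kernel boots without an initrd, which is where much of the speed comes from.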