Is Less Really More?

The Economist just ran an article on less powerful computers becoming more attractive due to the recession and because more services are available online. This might hurt the computer industry. Well, not entirely true. While a “regular” consumer may indeed find a slower netbook good enough, a corporation putting slower machines on its workers’ desks will have to invest in reliable networking and faster servers to carry the extra computational load. So computing services will still have to be paid for, either through direct hardware investments or by paying service providers such as salesforce.com and Amazon EC2. The right question is whether investing in faster servers instead of desktops yields a net saving in IT expenses. The impact on the industry should be viewed the same way, since the new arrangement, with servers doing more and desktops doing less, may give rise to new and unexpected opportunities.
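
To make the trade-off concrete, here’s a purely hypothetical back-of-envelope comparison in Python; every figure is an assumption made up for illustration, and whether the server-heavy option comes out ahead depends entirely on the numbers you plug in:

    # Hypothetical back-of-envelope comparison; every figure is an
    # illustrative assumption, not real pricing data.
    n_workers = 500

    # Option A: a powerful PC on every desk.
    fat_desktops = n_workers * 1200          # $1200 per workstation

    # Option B: cheap netbooks on desks, plus the servers and faster
    # networking needed to carry the shifted computational load.
    thin_clients = n_workers * 350           # $350 per netbook
    servers = 10 * 8000                      # 10 shared servers at $8000 each
    networking = 40000                       # upgraded switches and links

    print("Desktop-heavy spend:", fat_desktops)                        # 600000
    print("Server-heavy spend:", thin_clients + servers + networking)  # 295000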

ps: a personal example of where less is indeed more is this dishdrawer. My wife and I hardly used the full-size machine we had before, but we’re finding this tiny appliance fills up quickly enough that we actually use it.

Author: kwanghui

http://kwanghui.com

2 thoughts on “Is Less Really More?”

  1. The Gershon Report (commissioned by Tanner) highlighted the need for consolidation of data centres for environmental as well as economic reasons. It’s not a bad report (even though I have a few quibbles), and Tanner has promised to implement it in full. Worth a read.

    Some folk are even thinking of using the heat rising from huge data centres (such as Google’s) to drive a turbine and recapture some of the energy.

    The “thin client and smart protocol” approach we are moving to with cloud services like Gmail was something the unix world was pushing at least 15 years ago… but that model was set back by Microsoft’s dominance of desktop software, which became increasingly hungry for memory and power. MS fought the move to thin clients with all its marketing might, because its “cash cows” were designed to do all the heavy lifting on the desktop rather than run on multi-user machines.


  2. A powerful server with many cheap client machines is always going to be much more efficient than many powerful client machines, if you can pull it off. Several software vendors have pulled it off over the last few years; virtualization, for example, works very well today. Some of the efficiencies gained by centralizing computing resources are obvious:

    Typical PCs are outrageously powerful. They can process billions of instructions per second, far more than the typical user doing some word processing and web surfing requires.

    Other efficiencies are less obvious:

    Total cost of ownership drops dramatically with centralization, because maintaining and upgrading software in one physical location is far easier than doing it on many people’s personal computers.
    Most client software, even computationally intensive software like high-quality graphics, has a very low duty cycle: it does nothing most of the time. When you buy expensive PC hardware to support it, you are paying for peak usage, the few minutes per day when it does the tricky computations and you want the user interface to stay responsive. The average resource usage of this software is very low, often well under 10%, so running it on a central server can be more than 10x more efficient (see the rough sketch after this list).
    Expensive infrastructure such as high-reliability disk storage does not need to be replicated throughout an organization.
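
    To put rough numbers on the duty-cycle point, here is a minimal sketch; the utilization and headroom figures are assumptions chosen for illustration, not measurements:

        # Rough sketch of the duty-cycle argument; all numbers are
        # illustrative assumptions, not measurements.
        def desktops_per_server(server_capacity_in_pc_units, avg_utilization,
                                target_load=0.8):
            # How many low-duty-cycle desktop workloads a shared server can
            # host while keeping average load under target_load, so that
            # occasional simultaneous peaks stay responsive.
            return int(server_capacity_in_pc_units * target_load / avg_utilization)

        # A server with 4x the compute of one desktop, users averaging 5%
        # utilization of a full PC, server held at 80% average load:
        print(desktops_per_server(4, 0.05))  # 64 users on 4 desktops' worth of hardware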

    Virtualization, SaaS, etc. only became effective in the last few years, so many software and hardware vendors built their (then efficient) businesses around powerful client PCs running software locally. Over the last few years these vendors were faced with the innovator’s dilemma, and companies like Google, VMware and Salesforce.com emerged.

