Cloud Computing Doesn’t Make Sense!

By arunenigma on Saturday, April 18, 2009 with 0 comments

Clouds, clouds, clouds. Everyone talks about Google-style cloud computing — software as services off in the Internet “cloud” — as the future.

But while cloud computing is a marketing triumph, new research from McKinsey & Company asserts that trying to adopt the cloud model would be a money-losing mistake for most large corporations. The research is being presented at a symposium on Wednesday afternoon, sponsored by the Uptime Institute, a research and advisory organization that focuses on improving the efficiency of data centers.

The McKinsey study, “Clearing the Air on Cloud Computing,” concludes that outsourcing a typical corporate data center to a cloud service would more than double the cost. The study uses Amazon.com’s Web services offering to price outsourced cloud computing, since it is the best-known service and its prices are published. On that basis, according to McKinsey, the total cost of the data center functions would be $366 a month per unit of computing output, compared with $150 a month for the conventional data center.
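A quick back-of-the-envelope check of those figures, assuming the two monthly costs are measured in the same unit of computing output:

```python
# McKinsey's figures: monthly cost per unit of computing output.
cloud_cost = 366      # outsourced to a cloud service (AWS pricing)
in_house_cost = 150   # conventional corporate data center

ratio = cloud_cost / in_house_cost
print(f"Cloud costs {ratio:.2f}x the in-house data center")  # 2.44x
```

At roughly 2.44 times the in-house cost, the numbers do support the study's "more than double" claim.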

“The industry has assumed the financial benefits of cloud computing and, in our view, that’s a faulty assumption,” said Will Forrest, a principal at McKinsey, who led the study.

Owning the hardware, McKinsey states, is actually cost-effective for most corporations once the depreciation write-offs for tax purposes are included. And the labor savings from moving to the cloud model have been greatly exaggerated, Mr. Forrest says. The care and feeding of a company’s software, regardless of where it is hosted, and providing help to users both remain labor-intensive endeavors.

Clouds, Mr. Forrest notes, can make a lot of sense for small and medium-sized companies, typically with revenue of $500 million or less.

Instead of chasing cloudy visions, McKinsey suggests, corporate technology managers should focus mainly on adopting one building-block technology of the cloud model: virtualization. Virtualization allows server computers to juggle more software tasks, increasing utilization and thereby reducing capital and energy costs.

The average server utilization in a data center, according to McKinsey, is 10 percent. That can be fairly easily increased to 18 percent, the consulting firm says, by adopting virtualization software (EMC’s VMware is the leading vendor). With more aggressive adoption programs, servers in corporate data centers can reach up to 35 percent utilization, McKinsey said.
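A simple way to see why those utilization numbers matter: for a fixed total workload, the number of servers needed scales inversely with average utilization. The sketch below uses a hypothetical workload and assumes it divides evenly across identical servers, which is an illustration rather than the study's own model:

```python
# Rough model: servers needed for a fixed workload at a given utilization.
# Assumes the workload splits evenly across identical servers.
def servers_needed(workload_units, utilization, capacity_per_server=1.0):
    return workload_units / (utilization * capacity_per_server)

baseline = servers_needed(100, 0.10)     # 1000 servers at 10% utilization
virtualized = servers_needed(100, 0.18)  # ~556 servers at 18%
aggressive = servers_needed(100, 0.35)   # ~286 servers at 35%

print(f"10% -> 18%: {1 - virtualized / baseline:.0%} fewer servers")  # 44%
print(f"10% -> 35%: {1 - aggressive / baseline:.0%} fewer servers")   # 71%
```

Even the "fairly easy" jump from 10 to 18 percent utilization would, under this simplified model, cut the server count nearly in half, which is where the capital and energy savings come from.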

“We should focus on things we know work now, and virtualization works,” said Kenneth Brill, executive director of the Uptime Institute.

Category: cloud computing, current news
