By Eleanor Haas
How quickly both information
technology and social values are advancing! Bringing supercomputing capabilities down to a general-purpose level where
they can serve consumers, not just advanced scientists – as a new IBM product
does – is a giant leap forward in our ability to access, analyze and manage massive
amounts of information efficiently and cost-effectively. But that alone is no longer sufficient. Today
our society requires that this be done in ways that minimize climate change
effects. Greentech is not just a
fad. It’s here to stay.
The new IBM iDataPlex
line of server products does both. It
seems to me to represent a historic breakthrough: it enables cloud computing that supports Web
2.0 applications as well as the high-performance computing (HPC) requirements
of the life sciences, engineering, petroleum exploration, financial
services, and government and academic research. But perhaps the most important attribute of the new hardware design is
that it requires 40 percent less electrical power to run than alternatives with
comparable computing power and can eliminate air conditioning when outfitted
with a water-cooled wall.
As Web sites evolve, they
add features that impose new demands on the infrastructure and challenge
performance standards critical to the user experience. Web 2.0 applications empower users to do a
great deal more than just retrieve information. Users can now take the interactive features introduced with Web 1.0 to a
higher level, exercise new degrees of control over data, and even add value to
applications as they use them. These new
capabilities drive enterprises to scale capacity in a secure, reliable and
cost-effective way in order to deliver a satisfactory user experience. Cloud computing is one of the most important
new concepts emerging to make this possible.
Cloud computing is
computing done at a remote location, which is to say, out in the clouds. It is computing on a massive scale in terms
of both computing power and the range of computing tasks.
Supercomputers were originally built to enable scientists to handle
enormously complex calculations. Then Google and others
started locating the data storage and processing power of supercomputers on
vast banks of computer servers in remote data centers – Google called these
distant servers “the cloud” – instead
of on mainframe computers or a network of multiple processors on the Google
campus. And Amazon and others started providing
cloud computing services – remote
computing services, also called web services – delivered over the Internet. And so cloud computing was born.
Cloud computing is hugely important but still nascent. A recent report
by Forrester Research said in
its executive summary: “Cloud computing is a new IT outsourcing model
that doesn’t yet meet the criteria of enterprise IT and isn’t supported by most
of the key corporate vendors. It’s wildly popular with startups, exactly
fits the way small businesses like to buy things, and has the potential to
completely upend IT as we know it.”
Cloud computing represents a fundamental shift in how we
handle information, according to BusinessWeek, because it enables companies to
write their own programs to run on a cloud provider’s servers. Irving Wladawsky-Berger, Chairman Emeritus,
IBM Academy of Technology and Visiting Professor of Engineering Systems, MIT,
sees two major factors that make cloud computing qualitatively different from
all IT concepts to date: One is massive
scalability. The other is the much
higher quality of experience it can provide for users. “As with the Web in the
mid-‘90s, every enterprise will have to develop its own cloud-like
capabilities, or work closely with partners that do,” he writes on his blog.
But cloud computing can be no more than a vision until
something is done about the data centers on which it relies. In general, today’s data centers can be
described as massive, sprawling and pushing the limits of power and space
available to them. Many have grown
through mergers and acquisitions, with different departments having their own
servers and a proliferation of small and mid-size servers. As a result, they are inefficient, expensive
to operate and have high energy requirements. Worse, they cannot be scaled effectively.
The new IBM iDataPlex system represents a solution, a basis
for the data center of the future. It
reduces the cost per server by approximately 20 to 25 percent by using
off-the-shelf components and open-source software, fits 138 percent more
servers into the same floor space and, best of all, as noted above, requires 40 percent
less power to run. It is intended both
for enterprise cloud computing initiatives and for clouds designed to host Web 2.0
applications.
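As a rough illustration, the density, power and cost figures quoted here can be combined in a quick back-of-envelope comparison. The baseline rack size, per-server power draw and per-server cost below are hypothetical placeholders for the sake of arithmetic, not IBM numbers:

```python
# Back-of-envelope sketch using the percentages cited in the article.
# Baseline values are hypothetical, chosen only for illustration.

BASELINE_SERVERS_PER_RACK = 42      # assumed conventional rack (hypothetical)
BASELINE_POWER_PER_SERVER_KW = 0.5  # assumed draw per server (hypothetical)
BASELINE_COST_PER_SERVER = 5000.0   # assumed cost in USD (hypothetical)

DENSITY_GAIN = 1.38   # "138 percent more servers in the same floor space"
POWER_SAVING = 0.40   # "40 percent less power to run"
COST_SAVING = 0.225   # midpoint of the "20 to 25 percent" cost reduction

idataplex_servers = BASELINE_SERVERS_PER_RACK * (1 + DENSITY_GAIN)
idataplex_power_per_server = BASELINE_POWER_PER_SERVER_KW * (1 - POWER_SAVING)
idataplex_cost_per_server = BASELINE_COST_PER_SERVER * (1 - COST_SAVING)

print(f"Servers in the same footprint: {idataplex_servers:.0f}")    # ~100
print(f"Power per server: {idataplex_power_per_server:.2f} kW")     # 0.30
print(f"Cost per server: ${idataplex_cost_per_server:,.0f}")        # $3,875
```

Even with made-up baselines, the compounding is the point: more than twice the servers per square foot, each drawing 40 percent less power, multiplies into a very different data-center economics.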
Not a lot of iDataPlex systems will be sold. The target universe contains only 1,000
prospects, each valued at upwards of $20 million, and each system will be
custom-built. But it’s already clear that cloud computing builds on itself, as
large companies become suppliers for smaller ones. And IBM has the pieces in place to help
customers acquire new data centers conveniently. IBM Global Financing will
offer them lending and leasing opportunities, IBM Global Asset Recovery
Services can manage the disposal of equipment in accordance with environmental
regulations, and IBM will team with third party technology companies to drive a
product ecosystem around iDataPlex.
That’s smart of IBM
because iDataPlex has the markings of a hot product. According to Forrester, “Cloud computing . .
. has all the earmarks of a disruptive innovation: It is enterprise technology packaged to best
fit the needs of small businesses and start-ups – not the enterprise.” An eco-friendly cloud computing system. What a thought!
What do you think about all this? Where do you see it going?