Seeing through the cloud

I returned just last Friday from a three-day junket to Banff, where I attended an information technology conference about something called “cyberinfrastructure.”

That word may be as difficult to type as it is to pronounce, but it has a specific and justifiable meaning in the world of scientific computing.

The term was actually coined about six years ago by the National Science Foundation in the United States, as a way of talking inclusively about co-ordinating information networks, computers and people in the service of high-end, data-heavy scientific research.

A lot of “big science” these days – particularly astronomy, climatology, and genetics – depends on the ability to assemble, store, process and share databases of sizes so tremendous as to be unimaginable a decade or so ago, but which have now become basic tools of the trade.

Canada, like most other developed technological nations, has invested heavily in the computer hardware and networking infrastructure to make this kind of science possible, but both demand and operating costs continue to rise at rates that often threaten to make the whole enterprise shaky.

A response to these spiraling costs and demands has been the development of something called “cloud computing.”

This, unfortunately, has become the buzzword of the moment in the IT world, and, as with most buzzwords, it has come to be used in so many senses that it has ceased to have any real sense at all.

One of the most interesting events at the Banff conference was a panel conversation featuring four experts in “cloud” computing, and rather ineptly titled “Where Does the Cloud Sit Today?”

Lousy as the title was, the panelists at the event were lucid, intelligent people who had pithy and pertinent things to say. The comments and questions from the audience were at a similarly high level of expertise (so much so that I kept the quality high by keeping my own mouth shut).

The general consensus of those present was that, to be meaningful, the terms “computer cloud” and “cloud computing” probably require more rigorous definition than they have received so far, and probably need to be distinguished from each other.

The most succinct definition of “cloud computing” came from one of the members of the audience.

This woman suggested that “cloud computing” should be understood to refer to high-intensity, network-distributed data computation, in which a number of synchronized computers, located at greater or lesser geographical distances from each other, combine forces to form a virtual machine, and to act like one enormously powerful computer.

High network bandwidth may play a role in this kind of activity, but bandwidth is not in itself a key consideration of cloud computing; it is really about synchronizing cycles on computers and pooling results.

The advantages of this kind of computing to computerized research are obvious: No one university or research institute needs to supply all of its own computing power to meet its needs; it can draw on the computing capacity of other organizations, and contribute its own capacity to the pool when it is not needed locally.
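To make that pooling idea concrete, here is a toy sketch in Python. It uses a local process pool as a stand-in for the synchronized, geographically scattered machines described above: one large job is split into chunks, the chunks are farmed out to separate workers, and the partial results are combined. The task (counting primes) and every name in it are illustrative inventions, not drawn from any actual cloud system.

```python
from multiprocessing import Pool


def count_primes(bounds):
    """Count the primes in [lo, hi) -- a stand-in for one node's share of a big job."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        # Trial division: n is prime if no d in [2, sqrt(n)] divides it.
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count


if __name__ == "__main__":
    # Split one large task into four chunks, farm them out, and pool the results.
    chunks = [(i, i + 25_000) for i in range(0, 100_000, 25_000)]
    with Pool(4) as pool:
        partial_counts = pool.map(count_primes, chunks)
    print(sum(partial_counts))  # total primes below 100,000
```

In a real cloud-computing setup the workers would be machines at other institutions rather than local processes, and the scheduling and synchronization would be far more elaborate, but the shape of the work, divide, distribute, recombine, is the same.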

Construed in this sense, cloud computing is a real, existing service, with measurable value and proven results.

It is when the term moves out of the academic and institutional environment that its meaning and value begins to become, well, hazy.

It is common now for companies in the IT sector to present their goods or services as cloud-based or cloud-ready.

Low-powered portable computing devices like smartphones and the iPod Touch, for instance, are often spoken of as cloud-based technologies, because they rely on the workings of network-serving computers for most of their functionality.

Similarly, internet services like Google Apps are touted as “cloud” computing services because, once again, a bank of network servers does most of the actual computing work, while the computer or portable device at the other end does little more than send out orders and receive results in its web browser.

In truth, though, neither of those activities really qualifies as cloud computing.

In the case of cellphones and iPod Touches, what is going on is really nothing more than the old-fashioned smart-server/dumb-client model that the oldsters among us recognize from “mainframe” computing (à la the heyday of IBM), made new.

Similarly, a company availing itself of the online spreadsheet and word processing programs of Google Apps is not really doing cloud computing (though Google itself certainly is, to make the service work); it is simply doing plain old application outsourcing, which, again, is pretty old-hat stuff.

It might be meaningful to talk about these kinds of machines and services as being part of a computer cloud, in the sense that they depend on a collection of many different serving computers to make them function.

But that does not make them examples of cloud computing itself.

It is better to maintain that clear logical distinction, not only for the sake of intellectual clarity, but for another reason, too: to prevent genuine cloud computing from also going into the trash basket when the current, hyped-up talk about computer clouds and cloud computing finally falls into the contempt and dismissal it so roundly deserves.

Rick Steele is a technology junkie who lives in Whitehorse.