My first exposure to computers (aptly named at the time) was to a DEC PDP-10, a 36-bit machine, with punch cards. Processors were so expensive that my university owned two of them, and everyone had to share. Storage was expensive too, so you kept your programs on punch cards and your data on mag-tapes.
A few years later, timesharing arrived, and all your programs and data were stored on the mainframe, accessed remotely through dumb terminals. Some terminals were converted printers, others displayed 80x25 characters on a black-and-white screen (I remember lusting after a Heathkit H-19 with an external 300 baud modem), and others had built-in graphics capabilities, such as the RamTek terminals.
Still, everything was connected over slow RS-232 serial links, limiting what could be done. That was 1979.
Then in 1981 IBM introduced the IBM PC 5150, and things changed for a couple of decades. The personal computer ushered in the era of disconnected computing: data and programs resided in a little beige box in your den.
By 1999, the internet had connected most of these machines into a global network, displacing the then-dominant client-server model.
So here we are a decade later and a new model is about to take hold, a model quite similar to my original PDP-10 experience: Cloud Computing.
Data (now called “content”) and programs (now called “services”) reside in the Internet “Cloud,” accessible via a broadband connection from anywhere in the world using a multitude of devices. Granted, these devices are smarter, more colorful, and easier to use than their predecessors, but the fact remains: the Cloud is the biggest mainframe ever made; it just happens to be built from a number of smaller components called servers.
The cloud is the ultimate virtualization of computing: it, in effect, commoditizes not only processors, programming languages, and storage, but content as well.
In the next installment I will examine the impact this shift will have on the existing technology landscape and how it is changing corporate value propositions.