In the early days of computing, devices were big and expensive, and as a result, they were centralized. To gain access, the user had to go to the computer center and reach the mainframes via dumb terminals, all controlled by the Management Information Systems (MIS) department.
Around 1980, Bill Gates gave Microsoft, the company he founded, a clear mission: “A computer on every desk and in every home.” Thus the PC Revolution was born, and computing power was brought directly to the user.
Over time, early Information Technology (IT) departments found that the decentralization of data across all those desks was difficult to control and manage. They encouraged storing data on servers, often located in centralized data centers, with access provided through new virtualization platforms to thin clients leveraging remote desktop protocols.
Enter the “cloud” and mobility solutions, such as smartphones and tablets – now users want access to their data from anywhere, at any time. Data began to decentralize again as users discovered they could adopt new cloud-based software-as-a-service (SaaS) applications by setting up a new account with nothing more than an email address and a password.
Meanwhile, most Line-of-Business (LoB) applications have remained on servers, often powered by legacy database technologies. Enter global tech giants, such as Oracle, who have worked to acquire LoB application companies and are now centralizing them into their unified, international support infrastructures.
What’s next? If history has proven anything, it’s that we’re standing at the dawn of a new era of decentralization. This pendulum swing would contradict many prognostications about the direction IT is heading, but those who choose to ignore history tend to find that it repeats itself.