Person holds tablet in front of a cloud

The history of the cloud: a journey through the last 60 years

As internet users, most of us use the cloud every day. Whether we want to download an app, securely store our photos or work on a document with colleagues in real time, the cloud has become an integral part of our daily lives. It is not for nothing that experts consider the technology the backbone of digitalization. However, not everyone knows that the history of the cloud goes back to the 1960s. Join us on a journey through time and learn more about the history of the cloud.

The idea (1961)

John McCarthy was not only an AI trailblazer, but also a pioneer of the cloud. In 1961, he described the concept we know today as cloud computing. In a speech at MIT, the computer scientist explained that computing might one day be organized as a public utility. Like electricity, computing power and applications could be offered to the public on demand for a user fee.

According to his prediction, three principles are crucial for the cloud:

  1. Storage capacity, computing, applications, etc. are offered as IT services.
  2. Thanks to virtualization, many users can share the same computer resource.
  3. Access to the services takes place via a network.

First milestones (1960-1970)

An important milestone was the virtualization of operating systems by IBM in 1967. This allowed several users to share the same computer resource at the same time. In the following decades, research on this technology was to make further progress.

In virtualization, an intermediate layer imitates a hardware or software object to create virtual devices or services. For example, a physical server (the host) can be divided into several virtual machines (the guests) using hypervisor software. The virtual machines then share the resources of the host.
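
To make the idea concrete, here is a minimal, purely conceptual Python sketch of how a hypervisor hands out a host's resources to guests. The class names, VM names and resource figures are invented for illustration; a real hypervisor such as KVM or Hyper-V operates at the hardware level.

```python
# Purely conceptual sketch: a "host" hands out CPU and RAM to "guests".
# A real hypervisor (e.g. KVM, Hyper-V) works far below this level.
from dataclasses import dataclass

@dataclass
class VirtualMachine:
    name: str
    cpus: int
    ram_gb: int

class Host:
    def __init__(self, cpus: int, ram_gb: int):
        self.free_cpus, self.free_ram_gb = cpus, ram_gb
        self.guests = []

    def create_vm(self, name: str, cpus: int, ram_gb: int) -> VirtualMachine:
        # Only resources the host still has free can be assigned.
        if cpus > self.free_cpus or ram_gb > self.free_ram_gb:
            raise RuntimeError("host resources exhausted")
        self.free_cpus -= cpus
        self.free_ram_gb -= ram_gb
        vm = VirtualMachine(name, cpus, ram_gb)
        self.guests.append(vm)
        return vm

host = Host(cpus=16, ram_gb=64)
host.create_vm("erp-test", cpus=4, ram_gb=16)
host.create_vm("web", cpus=2, ram_gb=8)
print(host.guests)                        # two guests sharing one physical host
print(host.free_cpus, host.free_ram_gb)   # 10 CPUs and 40 GB still free
```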

In 1969, the Advanced Research Projects Agency Network (ARPANET), the forerunner of today’s internet, was born. This laid the foundation for the cloud. However, technical feasibility was still a long way off.

Important foundations (1970-1980)

In 1971, Ray Tomlinson sent the first email via ARPANET. The text was slightly less spectacular than this progress in networking. “Something like QWERTYUIOP”, the computer scientist later recalled.

Moreover, the first mass-produced microprocessor, the Intel 4004, came onto the market in 1971. The development of microchips made computers considerably cheaper and smaller. Computers for everyone suddenly seemed realistic. In the mid-1970s, Microsoft and Apple were founded.

Old computer

More and more households and companies use personal computers.

Crucial developments (1980-1990)

In the 1980s, more and more private households and companies started using personal computers. At the same time, the performance of computers improved. Within companies, several computers could be connected to a server via Ethernet, i.e. via cable.

The client-server architecture for distributing tasks and services in a network emerged. In this model, the client, e.g. your computer, requests a service from the server. The server then processes the request and provides the service, e.g. information or resources for a process, to the client.

Internet services such as the World Wide Web (see the next section) are based on the same principle. Web servers answer requests from web browsers on client computers and transmit the contents of web pages to the browsers, which display them on the user’s screen.
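
The request-response cycle is easy to demonstrate. The following minimal sketch uses nothing but Python's standard library: it starts a tiny web server and lets a client fetch a resource from it, with urllib standing in for the browser. The port number is arbitrary.

```python
# Minimal sketch of the client-server request-response cycle,
# using only Python's standard library.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The server processes the request and returns a resource.
        body = b"Hello from the server"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

server = HTTPServer(("127.0.0.1", 8000), HelloHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client (here: urllib instead of a browser) requests the service.
with urlopen("http://127.0.0.1:8000/") as response:
    print(response.read().decode())  # -> Hello from the server

server.shutdown()
```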

The rise of the World Wide Web (1990-2000)

Hardly any other achievement embodies the rapid progress of the technologies underlying the cloud as well as the World Wide Web. Tim Berners-Lee laid its foundations in 1989. At that time, the internet counted 100,000 connected computer systems worldwide. By 1992, the number already exceeded a million. The opening of the network for commercial use then led to the internet boom in the second half of the 1990s. E-commerce developed; Amazon, for example, was founded in 1994.

In the mid-1990s, the term cloud computing was used for the first time with its current meaning, in a business plan of the computer maker Compaq. However, it was the conceptual predecessor of the cloud that prevailed back then: grid computing, in which many different computers are connected via the internet to form a virtual high-performance computer. This made it possible to outsource computationally complex processes and, for the first time, to sell computing power.

Fiber optic cable

Fiber optic cables ensure fast (intercontinental) data transmission.

Big steps forward (1990-2000)

The 1990s also saw the development of the application service provider (ASP) model, the forerunner of Software as a Service (SaaS). The underlying concept is the same: you rent the software, i.e. the provider makes it available on central computers and takes care of maintenance and updates. However, ASP required considerable resources, for example because the applications had to be hosted on separate physical machines for each customer.

Thanks to the development of the multitenancy software architecture in the late 1990s, it became possible to offer applications with lower operating costs. In this architecture, a single software instance runs on a server and serves several tenants (user groups), who only have access to their own data.
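
To picture this, here is a minimal sketch of the isolation principle: one running instance, one shared data store (an in-memory list here), and queries that are always filtered by a tenant key. The tenant names and records are invented; a real system would add authentication, a database and much more.

```python
# Minimal sketch of multitenancy: one software instance, several tenants,
# each seeing only its own rows in a shared data store.
records = [
    {"tenant": "bakery_gmbh", "order": "flour, 200 kg"},
    {"tenant": "trade_ag",    "order": "pallets, 40 pcs"},
    {"tenant": "bakery_gmbh", "order": "yeast, 15 kg"},
]

def orders_for(tenant: str) -> list:
    # Every query is filtered by the tenant key, so user groups share
    # the instance but never see each other's data.
    return [r for r in records if r["tenant"] == tenant]

print(orders_for("bakery_gmbh"))  # only the two bakery_gmbh rows
print(orders_for("trade_ag"))     # only the trade_ag row
```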

Birth of the modern cloud (2000-2010)

The modern cloud was born in 2002, when Amazon launched a cloud computing platform to ensure the security and stability of its website. The possibility of renting out unused storage and computing capacity was a side benefit.

A centralized infrastructure emerged, with data centers hosting computing power and storage capacity. In 2005, the OpenNebula project was initiated. It resulted in free software for building and managing cloud systems, which fueled the emergence of private clouds.

In 2006, Amazon made the Elastic Compute Cloud available to the public. The first cloud database services emerged. With the founding of Dropbox in 2007, the concept of storing files in the cloud caught on. New data centers popped up everywhere.

Microsoft announced a cloud computing platform in 2008 and launched Azure in 2010. In the same year, NASA and Rackspace initiated OpenStack, a project that offers a free cloud computing architecture and attracted broad interest.

Development of the modern cloud (since 2010)

The number of cloud services grew, and so did the competition between providers. As users gained the ability to monitor their resources, trust in the cloud grew. Real-time streaming services processing data in the cloud emerged. Non-relational databases made it possible to store, process and retrieve large datasets quickly.

The DevOps approach became popular: development and IT operations work together to accelerate development and improve product quality. Closely tied to this is the microservice architecture for cloud applications, in which the components of an application are divided into small modules that interact with each other.
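
As a toy illustration of that modularity, the following sketch runs two invented "services" as separate HTTP endpoints, one of which composes its answer by calling the other. The service names, ports and payloads are made up; real microservices would run in separate containers with their own data stores.

```python
# Toy illustration of two cooperating "microservices": the order service
# builds its response by calling the independent pricing service.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

def serve(port, handler_cls):
    server = HTTPServer(("127.0.0.1", port), handler_cls)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

def send_json(handler, payload):
    body = json.dumps(payload).encode()
    handler.send_response(200)
    handler.send_header("Content-Type", "application/json")
    handler.send_header("Content-Length", str(len(body)))
    handler.end_headers()
    handler.wfile.write(body)

class PricingService(BaseHTTPRequestHandler):
    def do_GET(self):  # one small module: pricing only
        send_json(self, {"price_eur": 19.90})

class OrderService(BaseHTTPRequestHandler):
    def do_GET(self):  # another module that calls the pricing module
        with urlopen("http://127.0.0.1:8001/") as r:
            price = json.load(r)["price_eur"]
        send_json(self, {"order": "flour", "total_eur": price})

pricing = serve(8001, PricingService)
orders = serve(8002, OrderService)

with urlopen("http://127.0.0.1:8002/") as r:
    print(r.read().decode())  # order service aggregated the pricing service

pricing.shutdown(); orders.shutdown()
```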

Containers are the most common virtualization technique in the microservice architecture. Correspondingly, the architecture gained momentum when container services became available in the cloud. In 2014, the European Grid Infrastructure launched its Federated Cloud for European scientists. Moreover, the hybrid cloud, which combines the advantages of the private cloud (data protection) and the public cloud (flexibility), emerged.

Internet of Things

Edge and fog computing play an important role in the Internet of Things.

Future trends: edge and fog computing

The possibility of reducing waiting times by processing user requests outside the cloud attracted the interest of the research community. Edge computing, in which data processing takes place at the edge of the network, developed as an alternative to the centralized cloud architecture.

Fog computing also aims at reducing latencies. Local computing entities, so-called fog nodes, preprocess the data before they are uploaded to the cloud. Edge and fog computing play an important role in the Internet of Things: the cloud alone could not handle the flood of data from billions of connected devices. Blockchain technology is also being explored to secure the integrity of data in edge computing.
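
A minimal sketch of that preprocessing step: the fog node condenses many raw sensor readings into a small summary before anything leaves the local network. The upload function is an invented placeholder; a real system would call a cloud provider's API.

```python
# Minimal sketch of fog preprocessing: aggregate locally, upload a summary.
from statistics import mean

def fog_preprocess(readings: list) -> dict:
    # Reduce many raw values to a compact summary near the data source,
    # so only a fraction of the data travels to the cloud.
    return {"count": len(readings), "mean": round(mean(readings), 2),
            "min": min(readings), "max": max(readings)}

def upload_to_cloud(summary: dict) -> None:
    print("uploading:", summary)  # placeholder for a real upload call

raw = [21.3, 21.5, 21.4, 22.0, 21.8]  # e.g. temperature sensor readings
upload_to_cloud(fog_preprocess(raw))
```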

Future trends: machine learning and serverless computing

Machine learning in the cloud is becoming increasingly important, for example to predict user preferences. Google developed Cloud Tensor Processing Units (TPUs) to accelerate machine learning workloads; among other things, they power products like Google Translate. Moreover, the hardware in the data centers is becoming more heterogeneous, as graphics processors (GPUs) complement the traditionally used CPUs to speed up the applications running in the cloud.

In 2018, Microsoft began testing a subsea data center off the Scottish coast. Serverless computing is another trend that should not go unmentioned: users pay only for the computing power and storage capacity they actually use, not for idle time, while the provider manages the servers.
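
The programming model behind serverless is a plain function that the platform invokes on demand and bills per execution. Here is a minimal sketch in the style of common function-as-a-service platforms; the handler signature below resembles AWS Lambda's, but exact conventions vary by provider and the local invocation at the end only simulates what a platform would do.

```python
# Minimal sketch of a serverless function: no server to manage, the
# platform calls the handler on demand and bills the execution time.
def handler(event, context):
    # "event" carries the request data; "context" carries runtime info.
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}

# Local invocation for illustration (in production the platform calls it):
print(handler({"name": "cloud"}, None))
```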

Advantages of the cloud

The history of the cloud is a success story, and cloud-based ERP solutions enjoy great popularity. Since you need neither your own IT department nor your own hardware, you can start using such a solution without large investments. Moreover, the provider’s IT experts take care of software updates and of the operation and maintenance of the data centers. The fast implementation of adjustments and changes is another plus.

In addition, you can choose freely from the available modules and pay only for what you actually need. Data access is location-independent: all you need is an internet connection, a browser and a device. The industry solutions agilesFood and agilesTrade are also available in the cloud.

One thing is for sure: The golden age of the cloud has only just begun.

Are you interested in cloud-based ERP solutions or do you have questions about the cloud? We are happy to help.

Contact us!