Cloud computing is the delivery of on-demand computing services — from applications to storage and processing power — typically over the internet on a pay-as-you-go basis.
Rather than owning their own computing infrastructure, companies can rent access to anything from applications to servers from a cloud service provider. Providers benefit from significant economies of scale by delivering the same services to a wide range of customers. For firms, one benefit of cloud computing is avoiding the upfront cost and complexity of owning and maintaining their own IT infrastructure; instead, they simply pay for what they use.
A fundamental concept behind cloud computing is that the location of the service, and many of its details, such as the hardware or operating system it runs on, are largely irrelevant to the user (although this is not always the case in practice). It is with this in mind that the metaphor of the cloud was borrowed from old network schematics, in which the public telephone network (and later the internet) was often drawn as a cloud to signal that the underlying technologies did not matter.
Cloud computing as a term has been around since the early 2000s, but the concept of computing as a service dates back much further, as far as the 1960s, when computer bureaus would allow companies to rent time on a mainframe rather than have to buy one themselves.
These ‘time-sharing’ services were largely overtaken by the rise of the PC, and then by corporate data centers where companies stored vast amounts of data. But the idea of renting access to computing power has resurfaced repeatedly since then, in the application service providers, utility computing, and grid computing of the late 1990s and early 2000s. Cloud computing followed, taking hold with the emergence of software as a service and hyperscale providers such as Amazon Web Services.