I must apologize, because this post is part history lesson and part crusade. But it makes a very important point: now is the time to take serious notice of what and where genuinely important standards are affecting your enterprise as part of the current generation of new technologies and new business requirements.
The evolutionary path of the new generation of technology based on services consumed from clouds started with the Internet and its use of standards to ensure universal connectivity. The World Wide Web followed, extending this principle to provide universal content capabilities, and together these two building blocks transformed the understanding of what technology could do and how it could be used. For the first time, a genuinely open external environment in which people could find and exchange content emerged around a relatively simple set of standards, using a very different approach from the application-centric, proprietary, internal use of IT to automate business processes.
The new Internet/Web model became the basis for further development leading to Web 2.0, again promoting simple standards for accessibility that could support a growing range of services through which people could gain a new capability for human-centric universal interaction. The important point with both 'The Web' and 'Web 2.0' was the manner in which standards allowed 'any' to 'any', 'any' to 'many', even 'many' to 'many' activities to be supported on demand, with no pre-planned integration required. This is a radical departure from the client-server application generation of internal IT, where the number of users of an application, and its relationship to every other system, must be established in advance.
Client-server IT is based on a close-coupled, stateful, deterministic architecture, which is to say that the coupling between all elements must be pre-arranged to overcome incompatibilities. Stateful refers to being data-centric, with the need for 'one version of the truth', i.e. the single database that ERP introduced, as an example. And deterministic means that the number of users and the amounts of processing power or storage are all determined in advance and allocated. The contrast with the Internet/Web and the true cloud model could not be greater, as every point is reversed.
The architectural model is loose-coupled, stateless, and non-deterministic, meaning there is no pre-determined coupling. Instead, standards ensure that each system and service can be invoked as and when needed (think of Web browsing across the entire content available as an example). Stateless refers to there being no central data state that must be maintained between requests. Instead, following the principle of Representational State Transfer, or REST, a browser receives a representation of the resource to work with, and each request carries everything needed to service it (think of Google Maps, and compare this to a client-server geospatial application).
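The contrast between the two architectural models can be sketched in a few lines of code. This is purely illustrative: the class and function names are hypothetical, not drawn from any real product, and the "map" example simply echoes the Google Maps comparison above.

```python
# Stateful (client-server): the server keeps per-client session state,
# so capacity and connections must be provisioned per user in advance.
class StatefulMapSession:
    def __init__(self):
        self.center = (0.0, 0.0)   # the server remembers where this client is

    def pan(self, dx, dy):
        x, y = self.center
        self.center = (x + dx, y + dy)
        return f"tile at {self.center}"

# Stateless (REST-style): every request carries all the state it needs,
# so any server instance can answer it and no session has to be kept.
def render_tile(center):
    return f"tile at {center}"

session = StatefulMapSession()
session.pan(1.0, 2.0)                  # state lives on the server
stateless = render_tile((1.0, 2.0))    # state travels with the request
```

Because the stateless request is self-contained, it can be answered by any of thousands of interchangeable servers, which is what allows Web-scale operation.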
This point is hugely important both in supporting the ability to scale far beyond anything a conventional database can support, and in terms of security of data. As an example, in the USA and elsewhere, it has allowed shared medical information systems to be implemented because no exchange of the actual data is required; data privacy law is thereby respected. Finally, the non-deterministic element refers to the ability to support as many users and services as are required, through scale-up and scale-down on demand. This last point lies behind the ever-increasing use of Open Source, as traditional application licensing agreements simply cannot support this model.
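The non-deterministic, scale-on-demand idea can be sketched as follows. This is a minimal illustration, not any real platform's API; the class name, the per-instance capacity figure, and the ceiling-division policy are all assumptions made for the example.

```python
# Sketch of non-deterministic capacity: instances are added and removed
# to track demand, rather than being determined and allocated in advance.
class ElasticPool:
    def __init__(self, per_instance_capacity=100):
        self.per_instance_capacity = per_instance_capacity
        self.instances = 1

    def handle_load(self, requests_per_second):
        # Scale the pool up or down so capacity tracks current demand;
        # -(-a // b) is ceiling division in Python.
        needed = max(1, -(-requests_per_second // self.per_instance_capacity))
        self.instances = needed
        return self.instances

pool = ElasticPool()
pool.handle_load(950)   # demand spike: the pool grows to ten instances
pool.handle_load(50)    # demand falls: the pool shrinks back to one
```

A traditional per-seat or per-CPU licence has no sensible answer to a pool that holds ten instances at noon and one at midnight, which is the licensing problem the paragraph above refers to.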
But why the term 'true clouds'? It was chosen as a name for the evolution of the Internet/Web architecture towards its next stage: the capability to support universal shared processes. The defining characteristic is that a loose-coupled, stateless, non-deterministic architecture cannot be represented by an enterprise architecture diagram. Whereas in a close-coupled, stateful, deterministic architecture it is possible to detail exactly what is connected to what, for what purpose, using which protocols, and so on, this is simply impossible with a loose-coupled, stateless, non-deterministic architecture, where a diagram can only show which systems are connected to which clouds and what services they offer.
By definition, a client-server architecture can never be a true cloud-based architecture, but using technology from the Internet/Web/services generation, such as virtualization, to improve the operational efficiency of client-server is of great value and offers a range of new benefits; this is what is often referred to as 'cloud'. The other point is that, almost by definition, a true cloud will operate around 'services' based on standards, to support the 'any' to 'any' and 'on demand' elements of the architecture.
'Services' is the latest evolution, introducing universal shared processes, an element currently conspicuously lacking on the Web, where various ways of combining forms with the content model are the only possibility. In much the same way as content is placed on a Web server for consumption, services are made available together with an orchestration that determines which services, from where, should be assembled on demand to produce a cohesive, and possibly complex, process that may extend across several enterprises.
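The orchestration idea described above can be sketched like this. Everything here is hypothetical, the registry, the service names, and the fulfilment process; the point is only that the orchestration is data describing which services to assemble, not pre-built integration wiring.

```python
# A registry of independently published services, each consumed on demand.
# Service names and behaviour are invented for illustration.
SERVICE_REGISTRY = {
    "check_stock":  lambda order: {**order, "in_stock": True},
    "take_payment": lambda order: {**order, "paid": True},
    "arrange_ship": lambda order: {**order, "shipped": True},
}

# The orchestration is just data: an ordered list of service names.
# The same services could be recombined into a different process
# without any re-integration of the underlying systems.
FULFILMENT = ["check_stock", "take_payment", "arrange_ship"]

def run_orchestration(steps, order):
    for step in steps:
        order = SERVICE_REGISTRY[step](order)   # looked up and invoked on demand
    return order

result = run_orchestration(FULFILMENT, {"item": "widget"})
```

Each service in the registry could belong to a different enterprise, which is what makes the resulting process a shared, inter-enterprise one rather than an internal application.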
The ability to add universal shared processes to the external inter-enterprise space marks a significant revolution in business commerce. It has led to a new generation of standards being formulated that go beyond purely technology issues, as well as to discussion at the level of the World Economic Forum. A selection of important initiatives and technology standards is shown below:
- The World Economic Forum has recently conducted a survey and published a number of reports on the topic of cloud computing, alongside its many other interests, such as climate change and economic growth
- Security guideline standards – the Cloud Security Alliance (CSA) is a significant standards body which has led the development of guidance on identification, authentication, authorization, encryption, and secure access to personal and corporate data and assets
- Grid computing and cloud projects – the Globus Project, OGSA (Open Grid Services Architecture) from the OGF, NASA's Nebula Cloud, the eScience Open Science Data Cloud, the RESERVOIR FP7 project, OpenStack, and many others
- API standards – APIs for access to cloud resources, which can be split into open and closed (proprietary) APIs
- Open Data Center Alliance – the standardization of the data center and its capability to support cloud services
At the beginning of this post I said I wanted to make a point. It is this: approach what you are doing with clouds and new front-end technologies, including hosted services, with a strong understanding that this is all about building a consistent new environment around key standards in order to do business with others. It is not, as with existing IT, about an internal environment where many organizations have decided to make their own 'standards' for what they do and how they do it.