CTO Blog



The open source community delivers cloud stacks


The trigger for this blog was the recent SIENA (Standards and Interoperability for eInfrastructure Implementation Initiative) conference, entitled Cloudscape III, which focuses on the scientific computing community. It was an interesting set of presentations, all available online, showing the same leadership that has in turn produced the Internet, the World Wide Web and grid computing. Taken together, the papers represent probably the clearest view of how clouds and their use will develop, as the big scientific centres plough real expertise into solving the everyday issues that will make this new level of computational services work.

Included in the presentations is one on NASA Nebula, which provides a reference model for interoperable clouds. The OpenNebula project, with its associated open source community developing the stack, is strongly backed by leading scientific centres including CERN, the home of the World Wide Web. OpenNebula 2.2 was released in March 2011 as ‘stable for daily operations’ under the Apache License; the previous release, OpenNebula 2.0.1, can currently be downloaded as part of the Debian, openSUSE and Ubuntu Linux distributions.
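For readers who want a feel for what the stack looks like in practice, OpenNebula drives its front-end through an XML-RPC interface. Below is a minimal sketch, assuming a front-end on the default port 2633 and placeholder ‘oneadmin:secret’ credentials; the exact argument list for the pool calls has varied between releases, so treat it as illustrative rather than definitive.

```python
# Minimal sketch: list the VM pool of an OpenNebula front-end over XML-RPC.
# Assumptions: front-end at localhost:2633, placeholder "oneadmin:secret"
# credentials; argument details may differ between OpenNebula releases.
import xmlrpc.client

ONE_ENDPOINT = "http://localhost:2633/RPC2"   # default RPC2 endpoint
SESSION = "oneadmin:secret"                   # "user:password" session string

server = xmlrpc.client.ServerProxy(ONE_ENDPOINT)

# one.vmpool.info(session, filter_flag, start_id, end_id, state)
# filter_flag -2 = all resources, -1/-1 = whole id range, state -1 = any state
success, body, _error_code = server.one.vmpool.info(SESSION, -2, -1, -1, -1)

if success:
    print(body)            # an XML document describing the VM pool
else:
    print("Request failed:", body)
```

The same interface carries the full virtual machine lifecycle (allocate, deploy, migrate, shutdown), which is what the command-line tools shipped in those distribution packages use underneath.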

An alternative that seems to be more driven by commercial users comes from Cloud.com (www.cloud.com) Open Source Cloud Computing, which in March also released its latest, and now significantly more comprehensive, software stack, CloudStack 2.2. This release not only provides an ‘out of the box’ (as they call it) pre-integrated set of everything you might hope for to build a private cloud, it now also includes CloudBridge. As the name suggests, this bridges between a CloudStack cloud and Amazon EC2/S3, allowing services to be shared and interchanged. This is a big part of the roadmap for the ongoing development of the CloudStack Open API, with the goal of a similarly seamless service for other ‘cloud’ or ‘virtual’ products. The ambition extends to being able to manage this hybrid environment through common policies and tools enacted at the API connection layer.
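The practical upshot is that EC2-style client code can be pointed at either environment. Here is a minimal sketch using the Python boto library; the hostname, port and service path of the CloudBridge endpoint are placeholders that depend on the actual installation, and the same code pointed at Amazon’s own endpoint would list instances on EC2 itself.

```python
# Minimal sketch: use an EC2-style client against a CloudBridge endpoint.
# Assumptions: the boto library is installed; "cloudbridge.example.com",
# the port and the path are placeholders for a real installation.
import boto
from boto.ec2.regioninfo import RegionInfo

region = RegionInfo(name="cloudstack", endpoint="cloudbridge.example.com")

conn = boto.connect_ec2(
    aws_access_key_id="YOUR_API_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
    is_secure=False,          # plain HTTP for the example only
    region=region,
    port=7080,                # placeholder; depends on the CloudBridge setup
    path="/awsapi",           # placeholder service path
)

# The same EC2-style calls now run against the private CloudStack cloud.
for reservation in conn.get_all_instances():
    for instance in reservation.instances:
        print(instance.id, instance.state)
```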

While it’s not open source, the Open Data Centre Alliance has to be brought into this as the third, and very commercial, community active in setting a path for deploying clouds. By the way, expect to see the term ‘open’ used more and more, simply because the focus now is on external interactions between enterprises using ‘services’, and by definition this means everything has to be ‘open’ to be usable. I commented on the Open Data Centre Alliance in a previous blog on Next Generation Data Centres, which are being built to function in a very different way in order to support cloud stacks. In drawing attention to the software stacks, I felt I should also point again to the design of the hardware that relates to them.

Together, these three industry moves create a pretty cohesive approach to the design and functionality of a mature cloud model, one supporting more than just the internal deployment of improved capabilities in a private model. On this point alone it’s worth taking the time to understand exactly what each is doing and, most importantly, why. At the same time it’s also necessary to do whatever makes sense today in updating and adopting virtualisation and clouds, with a clear idea of where you will need to be in a year or so’s time. We all know that worrying feeling of being driven to make a decision on new technology this year and wondering whether it will prove a help or a hindrance in the years ahead!


A week or two back I posted on the RSA Security Conference. Around the time it was taking place, RSA’s confidential core systems were attacked in a very skilful manner and certain confidential information was obtained. RSA, a division of EMC, published an online advisory note on the risk this might pose for RSA SecurID users. It seems the attack was aimed at obtaining the root algorithm that generates the random numbers used in two-factor authentication.
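RSA’s SecurID algorithm is proprietary, so exactly what was at risk is not public, but the open one-time-password standards illustrate the principle: each code is derived from a shared secret seed and a moving factor such as the current time, which is why compromise of the seeds or the generating algorithm is so serious. The sketch below follows the time-based scheme of RFC 6238, not RSA’s own scheme, and uses a made-up secret.

```python
# Minimal sketch of a time-based one-time password (RFC 6238 style).
# Illustrative only: RSA SecurID uses its own proprietary algorithm and seeds;
# the secret below is a made-up example value.
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
    """Derive a one-time code from a shared secret and the current time."""
    counter = int(time.time()) // interval             # moving factor
    msg = struct.pack(">Q", counter)                   # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                         # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

if __name__ == "__main__":
    # Both parties holding this secret compute the same code for the same
    # 30-second window, which is exactly why seed confidentiality matters.
    print(totp(b"example-shared-secret"))
```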

About the author

Andy Mulholland
Capgemini Global Chief Technology Officer until his retirement in 2012, Andy was a member of the Capgemini Group management board and advised on all aspects of technology-driven market changes, as well as being a member of the Policy Board for the British Computer Society. Andy is the author of many white papers and the co-author of three books charting the current changes in technology and its use by business, starting in 2006 with ‘Mashup Corporations’, which details how enterprises can use Web 2.0 to develop new go-to-market propositions. This was followed in May 2008 by ‘Mesh Collaboration’, focusing on the impact of Web 2.0 on the enterprise front office and its working techniques, and then in 2010 by ‘Enterprise Cloud Computing: A Strategy Guide for Business and Technology Leaders’, co-authored with the well-known academic Peter Fingar and one of the leading authorities on business process, John Pyke. The book describes the wider business implications of cloud computing and its promise of on-demand business innovation. It looks at how businesses trade differently on the web using mash-ups, the challenges of managing more frequent change through social tools, and what happens when cloud comes into play in fully fledged operations. Andy was voted one of the top 25 most influential CTOs in the world in 2009 by InfoWorld, and is grateful to the readers of Computing Weekly who voted the Capgemini CTO Blog the best blog for business managers and CIOs in each of the last three years.
