CTO Blog

Opinions expressed on this blog reflect the writer’s views and not the position of the Capgemini Group

The ‘Internet of Things’ meets the ‘Web of Services’!

You will remember how anything and everything was going to be connected and reporting back to a user, even if it was just a low-level on/off alarm to say the temperature was rising above a preset threshold in an area of a cold store? To do this we require low-cost, low-power networking, and the well-proven and accepted Bluetooth Special Interest Group, or SIG, brought this a step further forward on July 6th. This was the date the Bluetooth Core Specification Version 4 was formally adopted, introducing an ultra-low-power version with the aim of being ‘a significant contributor to the overall wireless sensor market’, with the explanation that ‘the advantage to this new protocol is that it is totally optimised for low power battery operation’. In this context, low power means a device that will run for years on a standard coin cell battery, so almost anything could now be shipped fully enabled to be ‘discovered’ and linked up to other devices using the well-established Bluetooth approach.

There is, however, a big difference between being able to supply data en masse and being able to use it usefully. That’s where the recent shifts in real-time data analysis come into play. By definition, sensor networks are about reporting real-time events, so there needs to be the capability to carry out real-time analysis. In-Memory data handling has also been a popular topic in business intelligence circles over the last year or so. However, like snake oil, it is full of promise but a little difficult to pin down in reality. In theory it’s possible to ‘dump’ data into a huge In-Memory holding cache that can be cut and diced using the power now available through ‘virtualised’ data centres. Put this together with the mass of small data signals from sensor networks and it looks like a paradigm shift: sensor network signals have been difficult to handle because small event data is notoriously inefficient to process, while on the other side real-time data analysis is crying out for more inputs to improve the accuracy of its output.

However, I did say it’s like snake oil, the famed cure-all medicine sold at fairs by conmen in the past, which decidedly failed to live up to the promises made by the seller. Not that I am suggesting that any of the vendors offering In-Memory capabilities are guilty of this trickery; it’s just that it’s remarkably hard to get any real depth on the topic. I was therefore delighted to find Richard Hirsch’s blog on ‘How In-Memory technology will impact process environments; indirect or direct’ posted on SAP Blogs. Richard starts by echoing my views on the difficulty of finding good material on the topic, explaining that he decided to investigate and write his own clarification.

There is a last part to this change: how to make all of this data and real-time analysis usable by average ‘users’. To quote Greg Chase: ‘if you make applications smarter so that they pre-fill related data and tailor themselves to the specific context of the user and process instance, you’ll make it easier for casual business users to engage with a process’. Back to snake oil? No; the Australian Government once again proves it can be done, as they continue to demonstrate with their ever-expanding ‘eCitizen’ programme. It is eCitizen because it puts the citizen, or user, at the centre, and I really urge you to go to the web site to see what a wonderful example it is of how we should be thinking about bringing all of this together. This example should really set a new gold standard for how to design and deliver web services. Here is the opening paragraph of what is on offer, to whet your appetite!

Report from Business to Government, Instantly.
Wouldn't it be great if you could manage your business reporting obligations to various government agencies in a secure, consistent way and from a single place? With GovDirect® you can finally:

  • Use a single secure logon for report submission to all supported government agencies
  • Know what your obligations are and when they are due
  • Complete reports with a few clicks of a button, forget the manual paper forms
  • Validate information within reports before you lodge to government
  • Receive instant notification and electronic receipts of your report lodgements
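Stepping back to the In-Memory discussion above: the pattern of pouring many small sensor signals into a memory-resident cache and querying it in real time can be sketched very simply. The sketch below uses entirely hypothetical names and plain Python, standing in for whatever In-Memory product a vendor might offer; it revisits the cold-store temperature alarm mentioned at the start.

```python
# A minimal sketch of in-memory handling of small sensor events:
# each reading lands in a rolling window held in memory, and a
# real-time query picks out sensors drifting past a threshold.
from collections import defaultdict, deque
from statistics import mean

class InMemorySensorStore:
    """Keeps a rolling window of recent readings per sensor, all in memory."""

    def __init__(self, window=100):
        self.window = window
        self.readings = defaultdict(lambda: deque(maxlen=self.window))

    def ingest(self, sensor_id, value):
        """Absorb one small event; old readings age out automatically."""
        self.readings[sensor_id].append(value)

    def rolling_average(self, sensor_id):
        data = self.readings[sensor_id]
        return mean(data) if data else None

    def over_threshold(self, threshold):
        """Real-time query: which sensors average above the threshold?"""
        return [sid for sid, data in self.readings.items()
                if data and mean(data) > threshold]

store = InMemorySensorStore(window=3)
for temp in (-18.0, -17.5, -12.0):   # cold-store zone A warming up
    store.ingest("cold_store_A", temp)
store.ingest("cold_store_B", -19.0)  # zone B holding steady

print(store.over_threshold(-16.0))   # → ['cold_store_A']
```

The point of the sketch is the shape of the problem, not the code: each individual event is tiny and almost worthless, but held together in memory they support an immediate answer to a question that batch processing would deliver too late.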

About the author

Andy Mulholland
Capgemini Global Chief Technology Officer until his retirement in 2012, Andy was a member of the Capgemini Group management board and advised on all aspects of technology-driven market changes, as well as being a member of the Policy Board for the British Computer Society. Andy is the author of many white papers and the co-author of three books charting the current changes in technology and its use by business, starting in 2006 with ‘Mashup Corporations’, which detailed how enterprises could use Web 2.0 to develop new go-to-market propositions. This was followed in May 2008 by ‘Mesh Collaboration’, focussing on the impact of Web 2.0 on the enterprise front office and its working techniques, and then in 2010 by ‘Enterprise Cloud Computing: A Strategy Guide for Business and Technology Leaders’, co-authored with the well-known academic Peter Fingar and one of the leading authorities on business process, John Pyke. That book describes the wider business implications of cloud computing and its promise of on-demand business innovation: it looks at how businesses trade differently on the web using mash-ups, the challenges of managing more frequent change through social tools, and what happens when cloud comes into play in fully fledged operations. Andy was voted one of the top 25 most influential CTOs in the world in 2009 by InfoWorld, and is grateful to the readers of Computer Weekly, who voted the Capgemini CTO Blog the best blog for business managers and CIOs in each of the last three years.
