Capping IT Off

Opinions expressed on this blog reflect the writer’s views and not the position of the Capgemini Group

The Future of Infrastructure - Disruptive Technologies

Can I really ignore infrastructure? Or do I have to still consider it? If so, why?

Over the past couple of weeks I have focused a lot on the future of infrastructure – the Invisible Infostructure (see here). For many, invisible means you can ignore infrastructure – I do not believe that this is, or ever will be, the case.

Let’s draw a parallel with motor racing. There are many examples of great racing drivers who also had to understand exactly how a car functions. For instance, Niki Lauda is famous for being an excellent car mechanic as well as a brilliant driver. If you look back to the start of car racing, most, if not all, drivers had to be good mechanics, as the technology was neither reliable nor stable. Look at the famous Silberpfeil of the 1930s, the Mercedes-Benz W125: a massive I8 (straight-8 cylinder) engine of over 5.6 litres, producing up to 650 bhp – enough to reach over 300 km/h.
 

Figure 1 : http://www.automoblog.net/2009/07/10/lewis-hamilton-with-the-first-silver-arrow/

Drivers of the W125, like the great Rudolf Caracciola, had to understand the mechanics to ensure the car was continually improved and could be prepared before each race. If you look even further back, say to 1910–1920, there are plenty of examples where the mechanic was actually the driver, having to fix the car during the race.
Time has moved on, technology has moved on, and nowadays drivers rely on on-board computers that track 100,000 data points per millisecond, where faults are predicted and fixed or addressed without impact. Yet technology in F1 is still key; it is invisible but cannot be ignored, and one could argue it is as important as it was back in the 1940s and 1950s, when Juan Manuel Fangio (El Maestro) dominated racing in Europe.

Technical infrastructure – or simply infrastructure – is going through the same process. When I started to work in IT towards the end of the 1980s, I had to understand how the 10 MB filecard was addressed, how to optimise the 640 KB of RAM, and how to use every available Hertz of my 8088 CPU running at 8 MHz. This is not necessary anymore. If you are writing code, you don’t need to understand how a 12-way multi-CPU server with 500 GB of RAM addresses 10 PB of storage – it is helpful if you do, but when writing a Ruby-based app it is not necessary.

As I noted above, IT cannot be ignored – just as a racing driver cannot ignore the engine, the gearbox, the suspension or the brakes. You have to understand its limitations and how to make the best of “IT”. And therein lies the real challenge – the speed of progress, the speed of innovation, in the IT field is mind-boggling. I saw this great picture on the IEEE Facebook site:

There are plenty of examples and quotes on how fast technology is developing. CPU speeds, RAM prices and sizes, new advances in physics, etc. are fuelling an innovation cycle in which new capabilities find new applications and new requirements drive new developments – an innovation cycle that is accelerating. And this is where the term “disruptive technology” comes in.

The term “disruptive technology” was coined by Harvard Business School professor Clayton Christensen back in the mid-1990s. He co-authored the article “Disruptive Technologies: Catching the Wave” and wrote the book “The Innovator’s Dilemma”, in which he separated technologies into two categories: sustaining and disruptive. Sustaining reads as “stable” / “established”, and disruptive as “not yet fully formed” / “not stable, yet”. There are many examples of disruptive technologies: the PC, email, mobile phones, but also the LED, as well as digital photography.

If we have progressed so far (looking back to just, say, 1980, when I was developing assembler code on my 8088), how much progress will there be looking forward? Have we reached the peak, or are we just getting started, with even more innovation and evolution (in terms of technology) to come? I remember seeing a paper from MIT stating that we are in a phase of acceleration – more innovation, evolution and progress will be achieved in ever more condensed periods, accelerating change and new opportunities. Now, I am not a scientist nor a philosopher, but I do wonder how all this change will be adopted by people and where we will “end up”.

I don’t think that 2020 will be radically different – but who knows what 2035 will be like. Coming back to my earlier point, though: a racing driver in 2020 will need to understand physics, aerodynamics and mechanics less than his colleague 100 years back – yet car technology will still be key. A programmer in 2020 will need to understand less about how storage gets allocated and how best to address memory effectively – yet IT infrastructure will still be key. “IT” is invisible but cannot be ignored.

About the author

Gunnar Menzel, VP Enterprise Architect
2 Comments
Thanks for sharing. I believe that by 2035 we will see a new industrial revolution – the Industrial Internet of Heavy Things revolution, if you will. It will come and accelerate as we saw with the telecommunications and original internet disruptive waves of the 1990s and early 2000s. This may even occur by 2020. Improvements in micro and wireless sensors are needed for durability and cost, but this is happening. The heavy things are needed for power generation, medical imaging, manufactured goods, transportation, etc., and these all can be made more reliable and available through health-based diagnostics and prognostics. IT is the processing platform and physical brain, product and operational knowledge is the intelligence base, and the heavy goods are the body and engine. Just as one generation can transform our view on life (Gen Now does not remember life without cell phones and Facebook), so too will the use and application of IT to our key industrial products be transformed in the next generation.
Hey Jim, glad to see you are on form. I agree with you and like the original article. Other parts of the infrastructure are also developing: middleware that scales both up and down to tiny devices, ensuring data gets through, so that we can analyse streaming data in near real time and mine the vast data lakes we are accumulating with new off-the-shelf machine learning libraries. What amazes me is how fast this stuff is emerging – the pace is frightening. It is virtually all open source as well. This has got to challenge the received wisdom about IT support and sustainability. The next few years will be really exciting.
