Can I really ignore infrastructure? Or do I have to still consider it? If so, why?
Over the past couple of weeks I have focused a lot on the future of Infrastructure – the Invisible Infostructure (see here). For many, invisible means you can ignore Infra – I do not believe that this is the case, nor that it will be.
Let’s draw a parallel with motor racing. There are many examples of great racing drivers who also had to understand exactly how a car functions. For instance, Niki Lauda is famous for being an excellent car mechanic as well as a brilliant driver. If you look back to the start of car racing, most, if not all, of the drivers had to be good mechanics, as the technology was not very reliable or stable. Look at the famous Silberpfeil of the 1930s (the W125): a massive I8 (straight 8-cylinder) engine of over 5.6 litres, producing up to 650 BHP – enough to reach over 300 km/h.
Drivers of the W125, like the great Rudolf Caracciola, had to understand the mechanics to ensure the car was being improved and could be prepared before each race. If you look even further back, say 1910–1920, there are plenty of examples where the mechanic was actually the driver, having to fix the car during the race.
Time has moved on, technology has moved on, and nowadays drivers rely on on-board computers that track 100,000 data points per msec, where faults are predicted and fixed/addressed without impact. Yet technology in F1 is still key; it is invisible but cannot be ignored, and one could argue it is as important as it was back in the 1940s and 1950s, when Juan Manuel Fangio (El Maestro) dominated racing in Europe.
Technical Infrastructure – or simply infrastructure – is going through the same process. When I started to work in IT towards the end of the 1980s, I had to understand how the 10MB filecard was being addressed, how I could optimise the 640KB of RAM, and how to use every available Hertz of my 8088 CPU running at 8MHz. This is not necessary anymore. If you are writing code, you don’t need to understand how the 12 multi-CPU system with 500GB of RAM addresses the 10PB of storage – it is helpful if you do, but when writing a Ruby-based app it is not necessary.
As I noted above, IT cannot be ignored – just as a racing driver cannot ignore the engine, the gearbox, the suspension or the brakes. You have to understand its limitations and how to make the best of “IT”. And therein lies the real challenge – the speed of progress, the speed of innovation, in the IT field is mind-boggling. I saw this great picture on the IEEE Facebook page:
There are plenty of examples and quotes on how fast technology is developing. CPU speed, RAM prices and sizes, new advances in the physical sciences, etc. are fuelling the innovation cycle, where new capabilities find new applications and new requirements drive new developments… an innovation cycle that is accelerating. And this is where the term “disruptive technology” comes in.
The term “disruptive technology” was coined by the Harvard Business School professor Clayton Christensen back in the mid-1990s. He co-authored the article “Disruptive Technologies: Catching the Wave” and wrote the book “The Innovator’s Dilemma”, in which he separated technology into two categories: sustaining and disruptive. Sustaining, read “stable” / “established”; disruptive, read “not yet fully formed” / “not stable, yet”. There are many examples of disruptive technologies: the PC, email and mobile phones, but also the LED, digital photography, etc.
If we have progressed so far (looking back to just, say, 1980, when I was developing assembler code on my 8088), how much progress will there be looking forward? Have we reached the peak, or are we only just getting started, with even more innovation and evolution (in terms of technology) to come? I remember seeing a paper from MIT stating that we are in a phase of acceleration – more innovation, evolution and progress will be achieved in ever more condensed periods, accelerating change and creating new opportunities. Now, I am not a scientist nor a philosopher, but I do wonder how all this change will be adopted by people and where we will “end up”.
I don’t think that 2020 will be radically different – but who knows what 2035 will be like. Coming back to my earlier point, though: a racing driver in 2020 will need to understand physics, aerodynamics and mechanics less than his colleague of 100 years back – yet car technology will still be key. A programmer in 2020 will need to understand less about how storage gets allocated and how best to address memory effectively – yet IT infrastructure will still be key. “IT” is invisible but cannot be ignored.