When Nicholas Carr posed the question 'Does IT matter?' in the Harvard Business Review some years back, it unleashed a hail of comments, mostly from the IT industry, insisting that this was not even a matter for debate: of course IT mattered!

Well, I agree (as you might expect), but possibly not in the way you might think! At the risk of starting another hail of abusive mail, I am going to suggest that we all missed some very important underlying trends which are now converging to make a real impact.
First, why do we say IT? The term dates only from the fifteen years or so following the arrival of the PC, when ubiquitous adoption shifted the emphasis to making information available rather than to 'computing' cycles. Is the kid texting in the street using IT? Is the mum at home shopping on the Internet for something special an IT user? I think not.
A literacy test today might well be said to include so-called 'IT' skills, but in reality these are merely skills for living in the 21st century. Actually, I am beginning to think that we should refer to the emerging technologies, including Web 2.0, collectively as 'CT', standing for Collaboration Technologies. But why has there been such rapid adoption of 'technology', whether IT or CT, outside the business or workplace? Well, it certainly isn't based on building business cases to justify the cost benefits! The simple answer is that individuals find it easy to work out whether a gadget feels like value in their lives. However, the definition of value in the eyes of an individual is wide-ranging, from a gadget that's fun through to the serious need to have a mobile phone in today's society.
Enterprises cannot handle the individual's complex view of 'value', but they are very keen on the simple-to-understand reality of cost. So now we have a broad and ever-widening segment of the population that is 'technology literate', increasingly and unthinkingly using technology to improve their lives, coming to work to meet technology that will impact them for, say, 30 to 40 hours a week and that is, what shall we say, good enough? 'Just adequate' is more likely the accurate description.
This leads me to conclude that Nicholas asked the wrong question. IT, or at least technology that lets us function better, does matter; but does it matter to the enterprise on the basis of cost, or to the individual on the basis of value? Let's take the example of a call centre to illustrate the point: if the centre handles more calls per hour by using technology to increase the 'efficiency' of existing call procedures, then that's a cost-based justification; more calls being converted into sales bookings is a value improvement. How might that be achieved? Well, just maybe, if the operators have better information-matching capabilities and more chance to collaborate with callers to work towards the optimum solution, it might happen. It's a human variation on the successful processes that Amazon uses for book recommendations, or that a low-cost airline uses to optimise flight times against cost.
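To make the 'information matching' idea a little more concrete, here is a minimal, purely illustrative sketch of the co-occurrence matching that sits behind Amazon-style 'customers who bought X also bought Y' suggestions. The product names, sample orders, and the suggest function are all invented for illustration; a real call-centre system would draw on far richer customer data, but the underlying matching idea is the same.

```python
from collections import Counter
from itertools import combinations

# Illustrative purchase histories; in a call centre these might be
# past orders or enquiries from customers with similar profiles.
orders = [
    {"broadband", "router", "landline"},
    {"broadband", "router", "tv_package"},
    {"mobile", "broadband"},
    {"tv_package", "sports_addon"},
]

# Count how often each pair of products appears in the same order.
co_occurrence = Counter()
for order in orders:
    for a, b in combinations(sorted(order), 2):
        co_occurrence[(a, b)] += 1

def suggest(product, top_n=3):
    """Return the products most often seen alongside `product`."""
    scores = Counter()
    for (a, b), count in co_occurrence.items():
        if a == product:
            scores[b] += count
        elif b == product:
            scores[a] += count
    return [item for item, _ in scores.most_common(top_n)]

# An operator discussing broadband could be prompted with likely add-ons,
# turning an 'efficient' call into one more likely to convert to a sale.
print(suggest("broadband"))
```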
This is a real Web 2.0 model, where the collaboration environment is based on external-world standards and expectations rather than on internal enterprise views of the way to take orders. The external customers and the internal staff are effectively 'users' in the same 'world' with the same technology. This rather suggests that, just as some skilled staff move SIM cards from corporately supplied cell phones to their own preferred private smartphones, there may be an increasing move to using the 'technology' that the user prefers.
Will we see skilled workers demanding that they can use their own tools and be compensated for their use? Actually, it's already happened in one global energy business, with over 1,000 people now being paid a monthly sum on their corporate Amex card to cover the provision of their choice of tools. Anyone for a Mac?