This was one news story that I had to read twice and even track down, as I couldn’t believe it! However, it does seem to be true, in the sense that there are multiple versions of the story covered in various places, and some include sensible-sounding quotes from various Google spokespeople.
A very quick recap: security vendor Websense has used Google’s binary search capability, together with new tools it has developed, to find various forms of Malware. The exact technique has been to search for strings used in well-known Malware such as Bagle, exploiting the little-known fact that Google actually looks inside the normally unreadable .exe files and indexes the results. Amazing! A little concerning too, when you read that they were able to locate more than 2,000 web sites filled with various malicious .exe files.
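To make the idea concrete, here is a minimal sketch of the string-signature approach the technique relies on. The signature byte strings below are made up purely for illustration (they are not real Bagle signatures), and where Websense used Google’s index of .exe contents as its search engine, this sketch simply scans local files directly:

```python
# Hypothetical illustration of signature scanning: look for byte strings
# known to appear in specific malware families. The signature values
# here are invented for the example, not real malware markers.

# Made-up example signatures: family name -> identifying byte string
SIGNATURES = {
    "example-worm-a": b"infected_marker_v1",
    "example-worm-b": b"payload_stub_xyz",
}

def scan_file(path):
    """Return the names of any signatures found inside the file at `path`."""
    with open(path, "rb") as f:
        data = f.read()
    return [name for name, sig in SIGNATURES.items() if sig in data]

if __name__ == "__main__":
    import sys
    for path in sys.argv[1:]:
        hits = scan_file(path)
        if hits:
            print(f"{path}: matched {', '.join(hits)}")
```

The clever twist in the story is that no scanning tool was needed at all: because Google had already indexed the contents of executables, a crafted search query for such strings did the work.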
Websense, like many others working in the loosely defined new world of Web 2.0 technologies, is planning to make its Google code ‘public’, though I could not find any definition of what they mean by this statement. They do say that this would be to ‘a select group of security researchers’, but I fear that the secret is now out. It’s a new challenge to those who wish to demonstrate their skills in this deadly area.
Those who follow my blog will know that I am watching what I perceive to be a shift from technology for Business, under the title of IT, to user, or consumer, use of technology, under the title of ‘social computing’. This piece of news probably justifies why I am watching this change so carefully: put simply, I believe it to be an unstoppable change. One that we will have to live with, but one filled with new risks and challenges for those of us whose professional duty is to provide ‘safe and reliable’ IT services to business.
What has just been kindly exposed to a whole generation of younger, and dare I say less responsible, users who may well fail to see the consequences of their actions is, in a sense, yet another MashUp. The speed with which Web-confident users are grasping this whole new idea of building a personalised view by combining several web sources, often located through Google or other search engines, is amazing, and frightening.
Amazing in terms of what you can do, and frightening in the lack of control over the sources. The reliability of data inputs, in terms of the accuracy of the final results, is one of the oldest concerns of computing, known to me over thirty years ago as GIGO: Garbage In, Garbage Out. Well, that’s the user’s problem, unless they feed the resulting outputs into the Business and corrupt everyone else’s data; then it’s our problem. But what if they unknowingly build in an apparently great piece of data which is actually Malware? Worse, what happens if someone deliberately builds a website that’s a honey trap for MashUp seekers?
So far the only potential help I have heard of to ‘manage’ this situation is from IBM, who seem to recognise the issues and the needs. I can only say at this stage that we should all consider ourselves warned, and watch developments closely.