As you may have gathered by now, I am a big believer that standards play a crucial role in the current wave of technology, and that they have allowed what I have called ‘shadow IT’ (activities that run in parallel to the official IT systems) to develop with reasonable safety. Perhaps more accurately this could be described as ‘technology-literate, user-driven IT’, as I am far from suggesting that all users are capable of doing, or should be allowed to do, whatever they want. The big safety net to me is standards: they allow users to be given more personal flexibility in how they choose to use technology, whilst ensuring that the poor CIO can have some peace of mind knowing that the standard establishes some boundaries to behaviour.
It’s therefore somewhat alarming to be watching RSS fragment into multiple so-called standards. RSS has been one of the building blocks of the whole Web 2.0 approach, and today it is increasingly used as a core mechanism in designing dynamic information distribution systems within our enterprises. A look at Wikipedia shows the issue all too clearly: at the end of the first introductory paragraph it lists three major families and groups the versions into four separate release specifications: RSS 0.9, 0.91, 1.0 and 2.0.

This is bad enough, but a more detailed read-through shows some other possible variants; on a quick count I reckon there are eight possible versions. Admittedly many are interoperable, but this is feature creep on an alarming scale. We are all familiar with the damage that ‘code forking’ introduces, and here is the same thing right under our eyes, in something you may well have used to read this post. The IETF (Internet Engineering Task Force) has established the Atom syndication format with the goal of sorting this out, but the reality is that publishers should offer some guidance as to what hitting an RSS button will actually get you.
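To make the fragmentation concrete: the families are distinguishable right at the root element of the feed document. Here is a minimal sketch, using only the Python standard library, of how a reader might label which family a given feed belongs to; the three root shapes it checks for (an `<rss>` root for 0.91/2.0, an RDF root for RSS 1.0, an Atom namespace for Atom 1.0) are the published ones, while the function name and the “unknown” fallback are my own illustration.

```python
# Sketch: classify a syndication feed by its root element.
import xml.etree.ElementTree as ET

RDF_ROOT = "{http://www.w3.org/1999/02/22-rdf-syntax-ns#}RDF"
ATOM_ROOT = "{http://www.w3.org/2005/Atom}feed"

def feed_flavour(xml_text: str) -> str:
    """Return a rough label for which syndication family a feed uses."""
    root = ET.fromstring(xml_text)
    if root.tag == "rss":
        # RSS 0.91 and 2.0 both use a plain <rss> root with a version attribute.
        return f"RSS {root.get('version', '?')}"
    if root.tag == RDF_ROOT:
        # RSS 1.0 is RDF-based, so the root is rdf:RDF rather than <rss>.
        return "RSS 1.0 (RDF)"
    if root.tag == ATOM_ROOT:
        return "Atom 1.0"
    return "unknown"

print(feed_flavour('<rss version="0.91"><channel/></rss>'))
print(feed_flavour('<feed xmlns="http://www.w3.org/2005/Atom"/>'))
```

The point of the sketch is simply that a reader has to special-case at least three incompatible document shapes before it has parsed a single headline.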
Right now the subscriber cannot be sure, and some of you will be able to join in at this point with experiences of your own: a subscription could deliver a simple headline and one-sentence summary, or a massive download including an embedded video. Neither is necessarily wrong; it is more a case of matching expectations. In time I have little doubt it will get sorted out through a standard set of features, but right now may be the moment for a little more clarity and restraint at the deployment level.
Talking of Wikipedia, and the major role it plays as a dynamic encyclopaedia in our fast-changing world, leads to the second nasty shock, which on reflection I guess was inevitable. One of the other key values of Web 2.0 is that all of us are smarter than some of us, and this underpins the whole principle of Wikipedia with its contributions format. On reflection, the question of the editors’ qualifications to decide what was accurate, and what was hype, bias, unproven and so on, should have been asked more carefully. It is a pretty tough role to be performed by a volunteer; after all, most so-called ‘experts’ are too busy being paid for their expertise to be available to take this on.
It shouldn’t have been a surprise, therefore, when an editor was caught out with fake credentials, and interestingly it happened because users noticed that his work was suspect. Somehow it was a surprise, as I guess I just believed too much in the honesty principle. Wikipedia is promising to take steps to stop this happening again by improving the process of checking potential editors, but this, together with the RSS standards, may be the first sign of the inevitable shift from hype to reality. You will notice I did not say disillusionment, as that’s not the case; it is more a question of practical issues as these things go mainstream.