Web 2.0: Take 2.0

The following is the gist of remarks made to a colleague on Web 2.0. In these remarks I try to get at the causes of Web 2.0 and the delay in its naturally happening:

I really enjoyed your editorial on Web 2.0. I am always a bit leery when the parties creating a new conference are able to arbitrate what the new meme or concept is – there is often a tendency to do some self-interested shaping. But I also know that if anybody should get a dispensation, it is Tim O'Reilly. The pragmatic reality, though, is that a lot of developers have been doing Web 2.0-type things for the past 3-4 years – RIA vendors like Laszlo, Nexaweb, Droplets, General Interface, Isomorphic, and of course Macromedia with Flash immediately come to mind. Meanwhile the BI vendors are doing amazing things on the Web desktop and through portals, while IM/messaging/smart-connected apps are bursting all over the gaming, mobile, and embedded scene with Web-Me-2s. See my coverage here for many of these trends.

But a legitimate question to ask is: given all of these vendors working on Web 2.0-caliber software for at least 3-6 years, why did Web 2.0 not flourish at least 1-2 years earlier? A big part of the reason was the excesses of Web 1.0 VC investment – throwing money at any outlandish idea until the bubble burst in 2000. However, let me take a stab at some other, more technical but also important gatekeeper reasons for Web 2.0 being log-jammed.

The unjamming of the Web, allowing the transition to Web 2.0, has been due to three often-neglected factors:
1)the emergence of Mozilla, Safari, Opera and others as viable browser alternatives to the tune of 20-25% market share (what I see on my websites, which take a little over 1 million hits per month) and growing. This, along with Google's, Yahoo/Flickr's, and others' use of innovative, fast DHTML/Ajax, has helped to end The Great Constipatio … uhh, Consternation on the Web.

The Great Consternation

The Great Consternation has been Microsoft's refusal for the past 5 years to allow any updates to IE, including completion of promised W3C standards. Promising technologies such as XForms, SVG, XUL, JPEG 2000, XML Namespaces, CSS 2.1 and 3, JavaScript E4X and 2.0, XPath 2 and many others have been neglected or shunted aside for the Microsoft de facto replacements: SVG, XUL -> XAML; XPath 2, XSLT -> MS XQuery and LINQ; Java, JavaScript -> JScript, C#, VB.NET.

But the erosion of its 90%+ market share and the emergence of Yahoo and Google as fast-rising desktop service powers have forced Microsoft to change its game plan drastically. The protected Gate-d computing community, the Gates of Windows Vista, now has to compete a lot sooner in providing Web Services and broader Web connectivity than anticipated – and so now we see a new IE browser that is more standards-cognizant, and the new Quartz Web Designer which promises (in their own words) "to create accessible, standards-conformant Web sites by default or configure flexible schema settings to support all combinations of HTML, XHTML, and CSS standards as well as browser schemas. Use built-in compatibility and accessibility checkers to ensure your sites will render in any browser." An amazing about-face.

2)The next factor is the emergence of the server as the presentation-layer delivery agent of first choice for various flavors of UIs on a wide range of devices. Cognos BI servers deliver not just highly formatted and interactive reports over the Web but also Web-based designers that create those reports. SAP delivers inventory interaction and data-entry forms from servers that decide the final look and feel of those forms based on what device they are talking to. In the delivery of presentation-layer services, the server is fast evolving into the final authority, even in offline operations (the last UI update came from the server; the first priority is to get back online and communicate/replicate gathered results).
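The server-decides-the-UI pattern described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual implementation – the device categories, User-Agent patterns, and template names are all invented for the example:

```javascript
// Hypothetical sketch: a server-side helper that decides which flavor of a
// data-entry form to send back, based on the requesting device's User-Agent.
// Patterns and categories are illustrative only.
function pickUiFlavor(userAgent) {
  const ua = (userAgent || "").toLowerCase();
  if (/windows ce|palm|symbian|blackberry/.test(ua)) {
    return "mobile"; // stripped-down form, minimal scripting
  }
  if (/msie [1-5]\./.test(ua)) {
    return "basic";  // plain HTML form for older browsers
  }
  return "rich";     // full DHTML/Ajax form for modern desktop browsers
}

// The server renders the same logical form differently per device class.
function renderOrderForm(userAgent) {
  const flavor = pickUiFlavor(userAgent);
  return { flavor: flavor, template: "order-form-" + flavor + ".html" };
}
```

The point is that the client never chooses its own presentation; the server inspects the request and delivers the matching UI, which is exactly what keeps it the "final authority."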

3)While the grand architects of SOA and Web Services plan … plot… plod along, data-safe Web Services are exploding in usage on the Web. By data-safe, we mean apps are using Web Services that do not require critical or privacy-sensitive data but rather are primarily read-safe transactions. The service requester has carefully engineered transparency and innocuousness, such that the info being passed to the Web Service – even if it might be stored, aggregated, and even passed on to third parties – will pose no problems to the requester. The data is relatively benign, discrete, and/or shielded, such that its interception or extra use behind closed portals is of minor significance. I don't care if Google or Amazon are aggregating the nature of my website's requests for information. But I do care how much it costs (free or a pittance preferred) and how quickly and reliably it is delivered. I can and do disguise the requests for stock quotes from Bloomberg. And if the government wants to track my clients' weather and crop information requests – more power to the DOA guys and gals. But even the use of data-safe Web Services is exploding in richness as players on both sides of the "envelope" find different ways to monetize the service requested and rendered.
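The data-safe test described above can be made concrete with a small sketch. This is an invented illustration of the criterion – the field names and the GET-only rule are assumptions for the example, not part of any standard:

```javascript
// Hypothetical sketch of a "data-safe" check: a request qualifies for a
// third-party Web Service only if it is read-only and carries none of the
// privacy-sensitive fields listed here. Field names are illustrative.
const SENSITIVE_FIELDS = ["ssn", "account_number", "password", "credit_card"];

function isDataSafe(request) {
  // Only read-style operations qualify; anything that writes is out.
  if (request.method !== "GET") return false;
  // Reject requests whose parameters include sensitive data.
  return Object.keys(request.params || {}).every(
    (key) => SENSITIVE_FIELDS.indexOf(key.toLowerCase()) === -1
  );
}
```

A stock-quote lookup like `isDataSafe({ method: "GET", params: { symbol: "IBM" } })` passes, while a funds transfer carrying an account number does not – which is why quotes, weather, and crop data were among the first Web Services to explode in usage.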

So despite the efforts of the World's largest software firm – and even many of the also-rans – to shape the Web and the delivery of Web Services to their advantage, the Web Too has its own consistencies and rationales. Web users tend to drive Web systems toward free, and if not free, then fair and equitable in their costs. It's like arbitrage in financial markets: over time, the making of the markets drives prices toward long-term risk/return profiles. In the case of the Web, long-term adoption of systems and technologies is tempered and shaped toward systems expected to be not only reliable and trusted/secure but also fair and equitable in their distribution of costs and rewards.
