Tuesday, October 29, 2002
Just been reading an article about the whole web services hype thing. Good quote:
In some ways, vendors need Web services more than their customers do. They're desperate for a new technology that will boost slow sales in a bad economy.
I think this is the key point about the current interest in them. I keep asking myself: what do web services (w.s.) provide that previous technologies didn't? Why did previous attempts fail, and why will w.s. succeed?
First, previous attempts at distributed, possibly cross-border (i.e. between different companies), cooperation (in no particular order):
- DCOM
- CORBA
- HTML scraping
- XML/HTTP RPC
These have all been successful to a limited degree, though none is the de facto standard for web comms. Why? I think DCOM and CORBA have failed outside the corporate network due to poor firewall traversal and a lack of standards for it. CORBA also suffered from the lack of MS support, a significant factor these days, and one that may mean SOAP can mature to become as good as its predecessors and more widely used. HTML scraping is obviously fragile, though that is more down to not having an agreed data interface with the provider than to any fault of the transport.
I feel the XML nature of SOAP means that the feasible granularity of components will remain coarser than is possible with a binary encoding. That may not seem too important, but I think it's one of the reasons distributed systems have not delivered their full, envisaged potential. I've worked in a number of places where CORBA or EJB were touted as the future architecture for all systems, to enable the componentisation of all significant functionality so that, eventually, new systems could be assembled from pre-developed, pre-tested components. This rarely happens, for a number of reasons. Firstly, most systems are not designed with this level of reuse in mind: fitness for (one) purpose at the time of building is usually the main (or only) concern. Getting the granularity of these components right is difficult. Most of the time the code is ripped open, parts are rewritten, and if you're lucky some code reuse is possible. Restricting the choice of granularity because the transport is expensive is not going to help.
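To make the encoding-overhead point concrete, here's a rough sketch in Python using the standard library's xmlrpc module (the getQuote method and its argument are invented for illustration, as is the binary layout): it compares the size of one and the same remote call serialised as an XML-RPC request body versus a minimal binary packing.

```python
import struct
import xmlrpc.client

# The same remote call, serialised two ways.

# XML-RPC request body for a hypothetical getQuote(42) call:
xml_payload = xmlrpc.client.dumps((42,), methodname="getQuote")

# A minimal binary equivalent: a 16-bit method id plus one 32-bit
# network-order integer argument - six bytes in total.
binary_payload = struct.pack("!HI", 1, 42)

print(len(xml_payload), "bytes of XML vs", len(binary_payload), "bytes of binary")
```

The XML body runs to well over a hundred bytes before any argument of real size is involved, so chatty, fine-grained calls pay a large relative tax; that cost pushes designs towards fewer, coarser-grained operations.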
There, though, is the point. SOAP, XML-RPC, CORBA or whatever is only the transport - one layer out of the whole puzzle. I don't think distributed systems have failed so far due to poor transport implementations. Each had its advantages and its sore points, but those didn't decide whether the whole project failed or succeeded. Until the industry stands back from the SOAP hype and thinks hard about how to simplify both the building of reliable, scalable distributed systems and the correct way to partition them for reuse, rather than just hyping a given transport or data encoding, no real progress will be made.
Building these systems has always had an element of black art about it, and maybe it always will. Standardisation of the transport helps, but it's not a silver bullet. And even if standardisation were that important, we'd have to wait a long time for SOAP to reach the functionality of proven systems like CORBA, by which time the marketing machine will probably have moved on to The Next Big Thing™.
As mentioned in the article, SOAP is the current fashion and has the backing of both MS and non-MS groups, which is in itself an achievement. Thanks to the hype and attention it's received, it has the perception of being easier than previous solutions. It probably therefore makes sense to give it a go on new developments, but, as I've said, don't expect miracles.