Tuesday, October 29, 2002
Just been reading an article about the whole web services hype thing. Good quote:
In some ways, vendors need Web services more than their customers do. They're desperate for a new technology that will boost slow sales in a bad economy.
I think this is the key point about the current interest in them. I keep asking myself: what do web services (w.s.) provide that previous technologies didn't? Why did previous attempts fail, and why will w.s. succeed?
First, previous attempts at distributed, possibly cross-border (i.e. between different companies), cooperation (in no particular order):
- HTML scraping
- CORBA
- DCOM
- XML/HTTP RPC
These have all been successful to a limited degree, though none has become the de facto standard for web comms. Why? I think DCOM and CORBA have failed outside the corporate network due to poor firewall traversal abilities / standards. CORBA also suffered from the lack of MS support, a significant factor these days, and one that may mean SOAP can mature to become as good as its predecessors and more widely used. HTML scraping is obviously fragile, though that's mostly because it doesn't involve working with the data provider on an agreed data link.
I feel the XML nature of SOAP means that the feasible granularity of components will remain coarser than what's possible with a binary encoding. That may not seem too important, but I think it's one of the reasons distributed systems have not delivered their full, envisaged potential. I've worked in a number of places where CORBA or EJB was touted as the future architecture for all systems, to enable the componentisation of all significant functionality so that, eventually, new systems could be assembled from pre-developed / tested components. This rarely happens, for a number of reasons. Firstly, most systems are not designed with this level of reuse in mind; usually fitness for (one) purpose at time of building is the main (or only) concern. Getting the granularity right for these components is difficult. Most of the time the code is ripped open, parts are rewritten, and if you're lucky some code reuse is possible. Restricting the choice of granularity further because the transport is expensive is not going to help.
That, though, is the point. SOAP or XML-RPC or CORBA or whatever is only the transport -- one layer out of the whole puzzle. I don't think distributed systems have failed so far due to poor transport implementations. Each had its advantages and its sore points, but those alone didn't make whole projects fail or succeed. Until the industry stands back from the SOAP hype and thinks hard about how to simplify both the building of reliable, scalable distributed systems and the correct way to partition them for re-use, rather than just hyping a given transport / data encoding, no real progress will be made.
Building these systems has always had an element of black art about it, and maybe it always will. Standardisation of the transport helps, but it's not a silver bullet. And if standardisation really were the important bit, we'd have a long wait before SOAP reaches the functionality levels of proven systems like CORBA, by which time the marketing machine will probably have moved on to The Next Big Thing™.
As mentioned in the article, SOAP is the current fashion and has the backing of both MS and non-MS groups, which in itself is an achievement. Thanks to the hype and attention it's received, it has the perception of being easier than previous solutions. It probably therefore makes sense, on new developments, to give it a go -- but as I've said, don't expect miracles.
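To be fair, the barrier to giving it a go is low. Here's a minimal sketch of the kind of call I mean, using Python's xmlrpclib -- the endpoint URL and method name are made up, so substitute whatever a real provider actually publishes:

```python
# Minimal sketch only -- the endpoint URL and method name are invented;
# a real provider would publish its own in its documentation.
try:
    import xmlrpclib                     # Python 2
except ImportError:
    import xmlrpc.client as xmlrpclib    # Python 3 renamed the module

server = xmlrpclib.ServerProxy("http://example.com/rpc")

# One remote call, XML over HTTP -- the transport layer and nothing more.
print(server.quotes.getPrice("MSFT"))
```

The client code is trivial; the hard parts -- deciding what the interface should look like, how coarse-grained the calls need to be, and what happens when the other end is down -- are exactly the parts the transport doesn't solve.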
Saturday, October 19, 2002
- Installed a PCMCIA wireless card on my laptop -- which promptly blue-screened, rebooting several times, until I ejected the card. For some reason I had a Windows 2000 version of one dll, just the one, which made the installer install the wrong GUI dll and other bits of the card driver. Still not sure I've got all the right bits there... DLL hell; I hate Windows for it. A friend suggested going to XP to solve that issue:
- That's a cop-out
- Don't want to upgrade my flaky OS to another MS one that'll involve dodgy EULAs. More likely to convert it to some Linux distro...
- I've only got 128 MB of RAM on this baby; XP would kill it, and it's slow enough as it is...
- Now I've got the card working, I had a bit of a mare getting the network routing table to ignore the docking station's ethernet card and use the wireless card to reach the LAN instead. Must buy a decent IP book sometime. All the web pages I found either didn't help much or only covered the NT version of route.
- ADSL: got an email from my ISP saying any time up until 6pm. Spent ALL day checking the line. No change. Couldn't someone have told me the phone would still work? Still not sure how it all works -- does the phone's limited frequency response just act as the filter? Dunno. Nice speed-up for downloads though... just got to find some of the hallowed broadband content we keep getting told about.
- Speaking of which... BBCi Broadband. Only available to certain ISPs! Eh, haven't they heard about the net? How it works? Sent them a strongly, ahem, worded feedback rant. This is from their FAQ:
Why do I need to sign up to a particular service provider to access BBCi Broadband?
The BBC intends to expand the list of availability to include as many UK broadband service providers as possible. Please check availability with your broadband service provider. [BBCi FAQ]
What tosh.
The habit of reading loads of other people's blogs is scarily time-consuming. I've downloaded Peerkat as a first tool to try. Quite nice, but it was annoying to load up the various feeds I wanted. So, sleeves up, I dabbled in the art of Python to scan my Moz bookmarks and load all the feeds in a certain folder... cool :o)
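Roughly what the script does -- this is a from-memory sketch rather than the real thing; the file path and folder name are whatever your Mozilla profile uses, and it makes no attempt to cope with nested folders:

```python
# Rough sketch of the bookmark scan -- the path and folder name below are
# assumptions; Mozilla keeps its bookmarks in a single bookmarks.html file.
import re

BOOKMARKS = "bookmarks.html"   # copy from your Mozilla profile directory
FOLDER = "RSS Feeds"           # whichever folder holds the feed links

html = open(BOOKMARKS).read()

# Grab everything between the folder's <H3> heading and the next </DL>.
# Naive: a nested sub-folder inside it would cut the scan short.
section = re.search(r"<H3[^>]*>%s</H3>.*?</DL>" % re.escape(FOLDER),
                    html, re.S | re.I)
if section:
    for url in re.findall(r'HREF="([^"]+)"', section.group(0), re.I):
        print(url)   # then feed the list into Peerkat however it expects
```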
I see Rael has got some other tools; think I'll press on and see what else he's got to offer... maybe use a bit of Movable Type too... cos if Mark uses it, it must be OK :o)
I got Subversion up and running. I'd been kinda hoping to avoid building EVERYTHING from source; in the end I had to build the full Apache dev source and then the Subversion stuff. It all then just worked... cool.