So I was thinking last night about how far (or not) we'd come in the whole "Web vs. Web services" debate. In one respect we've come a long way; you hardly ever hear the argument that "the Web requires humans!" any more. Many (but still not all) people have instead retreated to indifference: the Web may or may not be usable for this, but it's moot anyhow, because the "world isn't going that way". Even that is still progress, though, as it shifts the discussion into the more concrete and less subjective realm of software architecture, allowing us to use reasonably well understood means of evaluating and comparing architectures for suitability to a particular problem domain.
But on the other hand, the Web still doesn’t get the respect it deserves from a lot of folk as a serious distributed computing platform. I’ve just been reviewing some papers for Middleware 2004, and some of them talk about a variety of distributed computing platforms, yet all fail to mention the Web as a peer.
There’s been a lot of low points, obviously, over the past four or five years, but a few highlights too. Some of the latter include;
- the XML Protocol WG agreeing, after much lobbying from Henrik and me, to use HTTP error codes for the transfer of SOAP faults (see the sketch after this list for what that looks like on the wire)
- the TAG finding on When To Use GET, though I haven't yet seen a Web service that actually uses GET that way
- Roy Fielding's defense of the uniform interface
- Atom rejecting calls to define a SOAP API and instead going with a RESTful design
- Amazon saying that 85% of their “Web services” traffic is via the RESTful interface
- Jim Webber and Savas Parastatidis supporting a uniform document submission semantic, dubbed processThis (a rough sketch of that style also follows below)
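
On the first of those points, here's roughly what HTTP-carried SOAP faults buy you on the wire. This is just a minimal sketch using Python's standard wsgiref server; the port and the fault text are invented for illustration. The key bit is that, per the SOAP 1.2 HTTP binding, an env:Receiver fault travels on an HTTP 500 (and an env:Sender fault on a 400) rather than being buried inside a 200.

```python
# Minimal sketch of a SOAP 1.2 endpoint that signals faults with HTTP
# status codes (env:Sender -> 400, env:Receiver -> 500), rather than
# returning every fault with "200 OK". Port and fault text are invented.
from wsgiref.simple_server import make_server

FAULT_BODY = """<?xml version="1.0"?>
<env:Envelope xmlns:env="http://www.w3.org/2003/05/soap-envelope">
  <env:Body>
    <env:Fault>
      <env:Code><env:Value>env:Receiver</env:Value></env:Code>
      <env:Reason><env:Text xml:lang="en">Upstream failure</env:Text></env:Reason>
    </env:Fault>
  </env:Body>
</env:Envelope>
"""

def app(environ, start_response):
    # A real endpoint would process the request first; this sketch shows
    # only the fault path: the SOAP fault rides on a 500, not a 200.
    start_response("500 Internal Server Error",
                   [("Content-Type", "application/soap+xml; charset=utf-8")])
    return [FAULT_BODY.encode("utf-8")]

if __name__ == "__main__":
    make_server("", 8080, app).serve_forever()
```

The nice property is that generic HTTP intermediaries (caches, proxies, monitoring tools) can see that something failed without having to crack open the envelope.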
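
And on processThis: I won't put words in Jim and Savas's mouths, but the general shape of a uniform "process this document" semantic might look something like the following sketch of my own. The single URI, the dispatch on the document's root element, and the ack format are all my inventions for illustration, not anything from their writing.

```python
# Sketch of a uniform "process this document" endpoint: one URI, one
# operation, with meaning carried entirely by the submitted document.
# The dispatch-by-root-element scheme here is my own illustration.
import xml.etree.ElementTree as ET
from wsgiref.simple_server import make_server

def process_this(environ, start_response):
    if environ["REQUEST_METHOD"] != "POST":
        start_response("405 Method Not Allowed", [("Allow", "POST")])
        return [b""]
    size = int(environ.get("CONTENT_LENGTH") or 0)
    doc = ET.fromstring(environ["wsgi.input"].read(size))
    # The document's root element, not a URI-per-operation scheme,
    # determines what "processing" means.
    reply = f"<ack xmlns='urn:example:ack'>received {doc.tag}</ack>"
    start_response("200 OK", [("Content-Type", "application/xml")])
    return [reply.encode("utf-8")]

if __name__ == "__main__":
    make_server("", 8081, process_this).serve_forever()
```

The parallel with the Web's uniform interface should be obvious: one generic operation, with all the variability pushed into the documents being exchanged.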