I gave a presentation tonight at the XML Users Group of Ottawa, titled REST, Self-description, and XML.

Not unexpectedly, the slides don’t capture a lot of what was presented (and nothing of what was discussed), but there’s a story in there that should be easy to follow. It also has a surprise ending that caught at least one person off guard. That was my objective.

That’s two in the past couple of weeks; the RESTwiki and the ESW Wiki.

If they didn’t hack into the machines and change the MoinMoin database, then these have to be the lamest defacers ever.

Tim Bray writes;

When you’re explaining something to somebody and they don’t get it, that’s not their problem, it’s your problem.

Well, sorta. Let’s test it out.

Bob; Hey Jim

Jim; Hey Bob

Bob; Hey, your house is on fire!

Jim; Eh? My what’s on what?

Bob; I’m getting out of here! Best of luck with that fire.

Jim; Eh?

A nice post from Don last weekend, addressing the “roach motel” (aka “application silo”) problem, and what Longhorn’s doing to help developers who want to avoid it. Some comments;

Though I think their characterization of RPC is a bit naïve (NFS is a great counterexample of a broadly adopted RPC protocol), the argument in favor of common operations is a strong one that I’m extremely sympathetic to (watch this space).

NFS is built on an RPC infrastructure, but it’s not what you’d call RPC, because its users don’t define the interface; the protocol does. Consider that even though it’s built with RPC, you don’t see it integrated with other RPC-based services. I think there’s an important lesson there.

What the REST argument conveniently sidesteps is that had it not been for HTML (a common schema), HTTP (a common set of operations/access mechanisms) would have never registered on most people’s radar.

I don’t know about others, but I’ve never sidestepped that issue. I’m quite up front when I claim that REST alone doesn’t address the “schema explosion” problem, and that HTML is only a “unifying schema” for humans. I commonly follow that up with an explanation of why I like Semantic Web technologies, as they extend the Web to address the explosion problem for automata.

Anyhow, I’m very encouraged by the positive feedback, and will be keenly “watching that space”! Thanks, Don.

Dave Orchard wrote, and Don Box concurred, that it’s a good thing to avoid registration at the likes of IANA and IETF. I also concur, as my hopefully-soon-to-be-BCP Internet Draft with Dan Connolly describes.

Where I disagree with Dave and Don is summed up by Dave;

XML changes the landscape completely. Instead of having a small number of types that are registered through a centralized authority, authors can create arbitrary vocabularies and even application protocols through XML and Schema. In the same way a client has to be programmed for media types, a client must be programmed for xml types and wsdl operations.

IMO, XML doesn’t change the landscape in that way at all. It’s always been possible to have an explosion of data formats and protocols; 10 years ago you could have done it with ASCII and ONC or DCE. The fact of the matter is that we don’t see these things on a large scale on the Internet because most people don’t want them. Not only is it expensive to develop new ones – even with a fine framework for their development, such as SOAP & XML Schema – but you’re very typically left amortizing that expense over a very narrowly focused application, such as stock quotes or shoe ordering, or what-have-you. The Web and Semantic Web efforts are an attempt to build a supremely generic application around a single application protocol (HTTP) and a single data model (RDF). Now that’s landscape-changing.
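To make that concrete, here’s a minimal sketch (Python, with made-up URIs) of what “one protocol, one data model” buys; the same scrap of generic client code works against any resource, with no per-service stubs to generate;

    # The URIs below are hypothetical. The point: HTTP supplies the single
    # uniform operation (GET) and RDF supplies the single data model, so a
    # new service requires no new client code; no getStockQuote() stub, no
    # orderShoes() stub, just another resource to dereference.
    from urllib.request import urlopen

    def fetch(uri):
        """Generic retrieval; works for any resource, seen before or not."""
        with urlopen(uri) as response:  # HTTP GET, the uniform operation
            return response.read()      # ideally RDF, the uniform data model

    quotes = fetch("http://quotes.example.org/ACME")   # hypothetical stock quote resource
    shoes = fetch("http://shoes.example.org/catalog")  # hypothetical shoe catalog resource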

I’m enjoying Tim Bray’s technology predictor success matrix series very much. It gave me an idea too. I hereby present the “Internet scale distributed system predictor success matrix”, aka ISDSPSM.

The predictor I’m going to use is “Uses a constrained interface”; that is, whether or not the interaction semantics between components are constrained in any way. The technologies are a mix of systems that were, at one time or another, deployed (or attempted) on the Internet.

    Winners   Score   Losers   Score
    Web       10      CORBA    0
    Email     10      DCOM     0
    IM        10      DCE      0
    IRC       6       ONC      0
    Napster   7       RMI      0
    DNS       10      Linda    10

I included Linda to show that constrained interfaces are not a sufficient condition for success. But they sure seem necessary, wouldn’t you say?

WS-Eventing, from BEA, MS, and Tibco.

The good news is that finally, we’ve got a Web services spec that tackles the hard problem: interfaces. Yes, WS-Eventing is an application protocol (do we have to bind SOAP to it too?).

The bad news is that it continues the track record of abusing other application protocols (if indeed its objective is to be used with them, which I can only assume is the case); the more that goes in the SOAP envelope, the less suitable the envelope is for use with those protocols, as it just duplicates what they already provide. Once again we see WS-Addressing as the culprit; wsa:Action and wsa:To exist to replace their counterparts in any underlying application protocol. For example, for HTTP, the counterparts of wsa:Action and wsa:To are the request method and request URI, respectively.
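To see the duplication, here’s a rough Python sketch of a WS-Eventing-style subscription sent over HTTP. The host, path, addressing namespace, and envelope are illustrative, not the exact WS-Eventing schema;

    import http.client

    # Illustrative envelope (urn:example:* namespaces are placeholders).
    # Note how the WS-Addressing headers restate what the HTTP request
    # line below already says.
    envelope = """\
    <s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope"
                xmlns:wsa="urn:example:addressing">
      <s:Header>
        <wsa:To>http://www.example.org/subscriptions</wsa:To>  <!-- duplicates the request URI -->
        <wsa:Action>urn:example:Subscribe</wsa:Action>         <!-- duplicates the method's role -->
      </s:Header>
      <s:Body><!-- subscription details --></s:Body>
    </s:Envelope>"""

    conn = http.client.HTTPConnection("www.example.org")
    conn.request("POST", "/subscriptions", body=envelope,
                 headers={"Content-Type": "application/soap+xml"})
    response = conn.getresponse()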

A point of frustration for me is that the semantics of subscription are largely uniform. What I mean is, wouldn’t it be great if all Web services supported “subscribe”? So why not use HTTP, which is built for uniform semantics, as I’ve done? Using a SOAP envelope with MONITOR, and with the resultant notifications, would be really sweet.
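In Python terms, a sketch only; MONITOR is the extension method I’ve proposed, and the hostname is borrowed from the WS-Eventing example;

    import http.client

    # One uniform method for subscription; the semantic lives in HTTP
    # itself, so there's nothing to restate inside an envelope. A SOAP
    # envelope could still ride along in the body where needed.
    conn = http.client.HTTPConnection("www.other.example.com")
    conn.request("MONITOR", "/OnStormWarning")
    response = conn.getresponse()
    print(response.status, response.reason)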

One pleasant surprise is the form that notifications take, as in this example. Notice the use of wsa:Action; the value is no longer a method, but is instead a type. Woot! That’s the first time I’ve seen the “action” semantic used properly in any Web services work. Presumably this is because notification semantics are entirely geared towards simply getting some data to some other application; basically, POST. Of course, technically, I don’t believe any “action” is required in this case, as there’s no intent on the part of the notification sender beyond simple data submission; the intent is determined and assigned by the recipient of the notification. But that’s still progress!

Another upside is the use of fine-grained URIs for identifying the endpoints, e.g. “http://www.other.example.com/OnStormWarning”, rather than something like “http://www.other.example.com/RPCrouter”.

Overall, very disappointing from a protocol and pub/sub POV, but the progress on resource identification, uniform semantics (even if it’s accidental 8-), and intent declaration is quite encouraging. Perhaps the next attempt will be done as an HTTP extension with a SOAP binding to MONITOR (the existing binding to POST would suffice for notifications).

Dave Orchard wonders how XQuery might be put on the Web.

My position seems to fly in the face of at least one part of Dave’s;

But clearly XQuery inputs are not going to be sent in URIs

Why admit defeat so easily? Did the TAG not say, “Use GET … if the interaction is more like a question …”? Well, aren’t XQuery documents questions? I think it’s quite clear that they are, and therefore XQuery would benefit from being serializable into URI form. That’s not to say that all XQuery documents would benefit, but many could.
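For example (the endpoint and the “q” parameter are made up), serializing a query into a URI is just percent-encoding;

    from urllib.parse import quote

    # Hypothetical XQuery endpoint and "q" parameter. The point is only
    # that an XQuery document, being a question, can travel in a URI;
    # the interaction becomes a GET, and the answer becomes linkable,
    # bookmarkable, and cacheable like any other Web resource.
    query = 'for $q in doc("quotes.xml")//quote where $q/@symbol = "ACME" return $q'
    uri = "http://example.org/xquery?q=" + quote(query)
    print(uri)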

I got a lot of pushback from some in the XQuery WG when I suggested this to them a few months ago, but I think the TAG finding is quite clear. I also strongly believe that doing this is the only significant way in which XQuery can be “put on the Web”.

On the upside, Dave says good things about the value of generic operations;

I first tried to re-use the Xquery functionality rather than providing specific operations in the SAML spec. My idea was that instead of SAML defining a bunch of operations (getAuthorizationAssertionBySubjectAssertion, getAuthorizationAssertionListBySubjectSubset, ..), that SAML would define a Schema data model which could be queried against. A provider would offer a generic operation (evaluateQuery) which took in the query against that data model. […]

Of course, while you’re generalizing, why not go a little further and just use “POST” (suitably extended) instead of “evaluateQuery”?
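Concretely (with a made-up endpoint), that last generalization looks like this;

    from urllib.request import Request, urlopen

    # Hypothetical resource representing the queryable SAML data model.
    # Rather than a WSDL-described evaluateQuery operation, the query
    # document is simply POSTed to the resource it's asking about; the
    # generic method already carries the "evaluate this" semantic.
    query_doc = b"<query><!-- an XQuery against the SAML data model --></query>"
    request = Request("http://example.org/saml/assertions", data=query_doc,
                      headers={"Content-Type": "application/xml"})
    with urlopen(request) as response:
        result = response.read()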

I also like what Dare had to say about this, in particular;

One thing lacking in the XML Web Services world is the simple REST-like notions of GET and POST. In the RESTful HTTP world one would simply specify a URI which one could perform an HTTP GET on and get back an XML document. One could then either use the hierarchy of the URI to select subsets of the document or perhaps use HTTP POST to send more complex queries. All this indirection with WSDL files and SOAP headers, yet functionality such as what Yahoo has done with their Yahoo! News Search RSS feeds isn’t straightforward. I agree that WSDL annotations would do the trick but then you have to deal with the fact that WSDLs themselves are not discoverable. *sigh* Yet more human intervention is needed instead of loosely coupled application building.

Heh, good one, especially the use of the “human argument” against Web services. 8-)

When it comes to predictions, I like to put it on the line and make mine measurable. As published at SearchWebServices, I have two predictions this year;

  • Web services will continue to struggle to be deployed on the Internet. I’ll restate an earlier prediction I made this year; that by the end of 2004, the number of parties offering publicly available non-RESTful Web services (as registered with XMethods.net) will have plateaued or be falling.
  • Another high profile public Web service will be developed in both REST and Web services/SOA styles, and again — as with Amazon — the REST-based service will serve at least 80% of the transactions.
Jon Udell responds to Stefano Mazzocchi’s comments on an earlier column of Jon’s. Stefano wrote;

Marketing, protocol and syntax sugar aside, web services are RPC.

to which Jon responds;

I disagree. It’s true that Web services got off to a shaky start. At a conference a couple of years ago, a panel of experts solemnly declared that the “Web” in “Web services” was really a misnomer, and that Web services really had nothing to do with the Web. But since then the pendulum has been swinging back, and for good reason. Much to everyone’s surprise, including mine, the linked-web-of-documents approach works rather well. Not just one-to-one and one-to-many, but also many-to-many. Adam Bosworth’s XML 2003 keynote was, for me, the most powerful affirmation yet that Web services can and should leverage the Web’s scalable spontaneity. That’s the vision firmly planted in my mind when I talk about Web services.

I’m reminded of a picture Don Box linked to a few weeks ago. A dog dressed as a clown is still a dog. Until Web services embrace a constrained interface (I’d recommend this one), they will always be RPC.