Over the weekend, I found a great article by Steve Loughran on the impact of Sitefinder on Web services tools. It’s very well written, and covers the entire space.

I was particularly impressed by the in-depth analysis of the issues regarding BCP 56 (aka RFC 3205) “On the use of HTTP as a Substrate”, which Verisign used to defend its position. Steve saw right through that, which is easier said than done; there’s a lot of subtlety there, as the XML Protocol WG response to the IESG (which I co-authored) describes.

Norm asks;

Given a random XML document, one of the things you might want to know about it is, “What is an appropriate schema to apply”? Now, for DTDs, this is a simple question: look at the document type declaration. If it has one, the public and system identifiers will tell you what DTD to apply; if it doesn’t have one, there’s no DTD to apply.

Sounds like a job for RDDL.
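
To make that concrete, here’s a rough sketch of what the RDDL answer looks like in practice, assuming the document’s root element is in a namespace and that the namespace URI resolves to a well-formed RDDL directory. The function is mine, and the xlink:role value is the nature RDDL conventionally uses for W3C XML Schema; treat it as illustrative rather than normative.

import urllib.request
import xml.etree.ElementTree as ET

RDDL_RESOURCE = "{http://www.rddl.org/}resource"
XLINK_HREF = "{http://www.w3.org/1999/xlink}href"
XLINK_ROLE = "{http://www.w3.org/1999/xlink}role"

def find_schema(doc_path):
    # The root element's namespace URI is the hook; it should be dereferenceable.
    root = ET.parse(doc_path).getroot()
    if not root.tag.startswith("{"):
        return None                              # no namespace, nothing to look up
    ns_uri = root.tag[1:].split("}", 1)[0]
    # A RDDL directory is XHTML with rddl:resource elements embedded in it.
    with urllib.request.urlopen(ns_uri) as resp:
        directory = ET.parse(resp).getroot()
    for res in directory.iter(RDDL_RESOURCE):
        if res.get(XLINK_ROLE) == "http://www.w3.org/2001/XMLSchema":
            return res.get(XLINK_HREF)           # a schema the namespace owner points at
    return None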

A good and seemingly fair comparison of different approaches to using Amazon Web services in PHP, via SOAP and REST.

I’d previously heard Tim O’Reilly’s 85% number (relayed via Jeff Barr), but just learned of another interesting stat in that article; Amazon’s RESTful queries (via GET) are 6x faster than their non-RESTful ones (via POST). That seems high to me, but since I don’t have any more details in front of me, it’s hard to pin down why that might be the case. Anyone got the details?
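
For context, here’s roughly what the two styles look like from the client side; a sketch only, with placeholder endpoints and parameter names rather than Amazon’s actual interface. Whatever the explanation for the gap turns out to be, the difference in shape is visible at a glance.

import urllib.parse
import urllib.request

# RESTful style: the whole query lives in the URI and goes over GET,
# so it's cacheable and there's no envelope to unwrap.
params = urllib.parse.urlencode({"KeywordSearch": "rest", "mode": "books"})
rest_req = urllib.request.Request("http://xml.example.com/onca/xml3?" + params)

# SOAP style: the same query tunnelled through POST inside an envelope.
envelope = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <KeywordSearchRequest>
      <keyword>rest</keyword>
      <mode>books</mode>
    </KeywordSearchRequest>
  </soap:Body>
</soap:Envelope>"""
soap_req = urllib.request.Request(
    "http://soap.example.com/onca/soap3",
    data=envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml", "SOAPAction": '"KeywordSearchRequest"'},
    method="POST",
)

# Either request would then be dispatched with urllib.request.urlopen(...).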

Along with most everybody else I imagine, I had a look over the Avalon/WinFS stuff from Longhorn this week. Jon Udell sums up my position better than I could;

Yeah, “embrace and extend” was so much fun, I can hardly wait for “replace and defend.” Seriously, if the suite of standards now targeted for elimination from Microsoft’s actively-developed portfolio were a technological dead end, ripe for disruption, then we should all thank Microsoft for pulling the trigger. If, on the other hand, these standards are fundamentally sound, then it’s a time for what Clayton Christensen calls sustaining rather than disruptive advances. I believe the ecosystem needs sustaining more than disruption. Like Joe, I hope Microsoft’s bold move will mobilize the sustainers.

Yup, bingo. I was shocked when I realized that they were completely reinventing the wheel here for no (really) good reason … except that somebody high up figured, as Jon says, that the Web was “ripe for disruption”. As much as I dislike many of MS’s business practices, I have the utmost respect for the company and the people there. But man oh man, what a stinker this stuff is. Remember Blackbird? Did these guys forget that they own the browser? If they had done this properly, they could have had the rest of the industry playing catch-up to their Web extensions for the next five years or more. What an enormous gaffe. Wow.

Just as an example of some things that they could have extended the Web with, consider these;

  • client-side containers for stateless session management; requires HTML extensions (drag-and-drop, events, etc..)
  • Web-browser-as-server for async notification; à la mod-pubsub (sketched below)
  • Advanced forms (XForms/Infopath/RDF-Forms); that Infopath is stuck in Office land is criminal
  • Web-friendly structured storage, where URIs are file names (yes, I meant it that way around)
  • Better HTTP proxy integration via per-request message routing, rather than per-connection routing which we currently have

All but the fourth require non-trivial, visible extensions to the Web … and the W3C and IETF aren’t currently touching them (except for forms).
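
To pick on the second item, since it’s probably the least familiar: a minimal sketch of a user agent playing origin server for async notification, assuming publishers simply POST event representations straight to it. The port, the path handling and the payload treatment are all invented for illustration.

from http.server import BaseHTTPRequestHandler, HTTPServer

class NotificationHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # A publisher POSTs an event representation to the listening agent;
        # the request URI names the subscription it belongs to.
        length = int(self.headers.get("Content-Length", 0))
        event = self.rfile.read(length)
        print("notification for", self.path, ":", event.decode("utf-8", "replace"))
        self.send_response(202)                  # accepted; handling is asynchronous
        self.end_headers()

if __name__ == "__main__":
    # The agent (standing in for the browser here) listens like any other server.
    HTTPServer(("localhost", 8080), NotificationHandler).serve_forever()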

A fine description of why we prefer our URIs to be treated opaquely, in the context of a brain-dead spam-busting idea by a UK MP.

Because I needed to use this myself, but also because I wanted to flaunt the simplicity and expressiveness of RDF Forms, I whipped together a little spec …

URI Proxy provides a means of describing a RESTful service as being a “proxy” (well, kinda – read the spec) for other RESTful services. Composition via aggregation.

You’ve seen examples of services like this before; Google’s cache (example – though the full URI isn’t used, which sucks), Google’s translator (example), the Web archive (example), etc..
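
For the curious, the pattern boils down to something like this; a toy sketch with made-up proxy base URIs standing in for the real services above, which differ in exactly how much of the target URI gets embedded and how it’s escaped.

from urllib.parse import quote

def proxied(proxy_base, target_uri):
    # Composition via aggregation: the target's URI rides along, percent-encoded,
    # as part of the proxy resource's own URI.
    return proxy_base + quote(target_uri, safe="")

print(proxied("http://cache.example.org/get?uri=", "http://example.com/some/page"))
print(proxied("http://translate.example.org/en-fr?uri=", "http://example.com/some/page"))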

Apparently that needs to be reiterated. Interesting.

Computers->Programming->Internet->Web Services->REST. Neat.

I’m in the middle (or perhaps it’s the end) of a conversation with Savas Parastatidis regarding a “radical” new form of Web service constraint. Consider his plan to send this along as a message;

<orderForm1>
  <carOrMotorcycle>car</carOrMotorcycle>
  <colour>blue</colour>
  <cc>1800</cc>
  <model>blabla</model>
</orderForm1>

Sure, there’s no symbol visible that can be mapped to an operation, but in order to understand what is being asked of a recipient of that message, something somewhere has to say what is being asked. Otherwise it would inherit the semantics of any underlying application protocol that transferred it – consider that it would mean different things to send it with HTTP PUT and POST – and therefore wouldn’t really be a message, just a piece of a message (representation anyone?).

Based on what Savas explains he wants that message to mean, though, what he describes is semantically equivalent to HTTP POST, only without URI targeting, and with an implicit operation rather than an explicit one. I claim that this is a better version of his message;

POST some-uri HTTP/1.1
Content-Type: application/order-form+xml

<orderForm1>
  <carOrMotorcycle>car</carOrMotorcycle>
  <colour>blue</colour>
  <cc>1800</cc>
  <model>blabla</model>
</orderForm1>

It’s pretty funny to see pieces of Web architecture being rediscovered by Web services folks. Once the sheer scale of the Web’s big picture hits home with them, well … let’s just say there are a whole lot of walls on which I’d love to be a fly.

P.S. more uniform interface goodness from Bill.

BEA CTO Scott Dietzen gave one of those typical Q&A interviews in which he says most of what you’d expect him, or any Web services proponent, to say. But he does say something interesting about UDDI;

I took an informal survey of our highest-end customers. We had our top 50 or top 40 end users in a user group together and I was astonished that all of them claimed they had transactional Web services in production and 60 to 70 percent had transactional Web services running between .NET and WebLogic in production. The UDDI[sic] adoption was much less.[sic] People were basically using URIs and URLs as their model for locating and binding into Web services.

and later,

I think it’s also clear that UDDI has been oversold.

Right. So Paul and I were right then?

I think there’s a critically important point being demonstrated here; even in the face of hype-up-the-wazoo, customers will generally opt for simplicity. This is why it is possible for Web services/SOA to fail: all it takes is for customers to understand how to do things the simpler way. The use of URIs and GET is really very simple, so it’s no surprise it came first. Knowing how to use POST and URIs (without operations in the body) is tougher. But it’s just a matter of time before they figure it out, IMO.
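
To make that last step concrete, a sketch of the two styles from the client side, with hypothetical URIs and media type; the first is what customers have already figured out, the second is the part that takes a while to sink in.

import urllib.request

# The easy part: the thing you want has a URI, so you just GET it.
order_req = urllib.request.Request("http://example.com/orders/1234")   # plain GET

# The tougher part: to act on something, POST a document about the request
# to its URI. No operation name in the body; the method plus the target URI
# already say what is being asked.
new_order = (b"<orderForm1><carOrMotorcycle>car</carOrMotorcycle>"
             b"<colour>blue</colour><cc>1800</cc><model>blabla</model></orderForm1>")
submit_req = urllib.request.Request(
    "http://example.com/orders",
    data=new_order,
    headers={"Content-Type": "application/order-form+xml"},
    method="POST",
)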