I wanted to expand a little on my dismissal of Sanjiva’s argument that “The Web is necessarily human centric”.

Sanjiva, in his support for – and authorship of – WSDL, presumably wants to permit developers to publish their own service-specific interfaces, such as ones supporting methods like the canonical “getStockQuote”, or even “getRealtimeStockQuote”. And I’m certain he’d claim that these are very much machine-facing interfaces, since that’s supposed to be the whole point of Web services. So far so good?

So why is a system built around GET suddenly not machine-facing? I’ve said before that the one thing that most distinguishes SOA and REST is the uniform interface of the latter; it says, in part, that the more general the operation, the more reusable the interface. In other words, using the example above, getStockQuote is more reusable than getRealtimeStockQuote. Moreover, GET is more reusable than getStockQuote.
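
To make that gradient concrete, here’s a minimal sketch in Python, with hypothetical URIs and stub names of my own invention, of why the general operation is the reusable one:

```python
# A minimal sketch, with hypothetical URIs and stub names, of why the
# general operation is the reusable one.
import urllib.request

def get(uri):
    """The uniform interface: one fully general operation, GET."""
    with urllib.request.urlopen(uri) as response:
        return response.read()

# The same client code retrieves *any* quote, real-time or otherwise,
# just by naming a different resource:
#
#   get("http://example.com/quotes/GOOG")
#   get("http://example.com/quotes/GOOG;realtime")
#
# whereas each service-specific operation needs its own stub and its
# own client code:
#
#   stub.getStockQuote("GOOG")
#   stub.getRealtimeStockQuote("GOOG")
```

One function, any resource; each service-specific method, by contrast, buys you exactly one thing.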

Following that logic – that more general means more human-targeted – one can only conclude that the methods most suited for machine targeting will be the most specific ones. So never mind getRealtimeStockQuote; we’d need getRealtimeStockQuoteForGOOGonNASDAQ.

Of course, that’s silly. So is the argument that the Web is only for humans. I hope (hah! 8-) that this finally puts that argument to rest (pun intended).

Stu Charlton responds to my comment (and Mark’s) about how Pat Helland, in a recent paper of his, discovered REST the hard way:

So, while the two Marks are suggesting Pat’s reached REST the hard way, I would suggest this is something he’s been saying for years, […]

That is the hard way! 8-O If Pat’s been unknowingly preaching REST constraints for years, then he’s done it from scratch. That’s a great personal accomplishment of course; I wish I were that smart. But wouldn’t it have been great if he had noticed that what he was talking about was being built out right under his nose for the past 15 years? 8-) I don’t fault him for that any more than I fault the bulk of the industry for also missing it (which is to say, a tiny bit 8-).

Anyhow, hopefully this paper can be the catalyst that helps push the industry towards a better understanding of the power and value of the Web. Of course, it also brings a new perspective to bear on the Web itself, from a seasoned distributed computing veteran, so that can only help Web proponents, perhaps motivating new Web-based solutions. At the very least, it’s got me thinking, which is always good 8-)

Eric, any chance we could get Pat to the workshop?

… is the title of a new blog post by yours truly on my consultancy’s weblog.

WS-I released the first draft of the Basic Profile today. I don’t know how they manage to write so much about something that isn’t necessary; you wouldn’t need profiles if interoperability were well defined. It seems to be purely a political move to bundle specs together so that the technology doesn’t look like it could be easily marginalized (though it can).

I mean, really, why on earth does anybody need to specify the possible HTTP response codes to be used, when HTTP clients can already deal with all of them (even if only to fall back to x00)?
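
For the non-believers, here’s a rough sketch of that fallback rule (it’s right there in RFC 2616, section 6.1.1) in Python; the set of recognized codes is just illustrative:

```python
# A rough sketch of HTTP's x00 fallback rule: a client that doesn't
# recognize a status code treats it as the generic x00 code of its class.
KNOWN_CODES = {200, 201, 204, 301, 302, 304, 400, 401, 403, 404, 500, 503}

def effective_status(code):
    """Map an unrecognized status code to the x00 code of its class."""
    if code in KNOWN_CODES:
        return code
    return (code // 100) * 100  # e.g. 418 -> 400, 507 -> 500

assert effective_status(404) == 404   # recognized: handled as-is
assert effective_status(418) == 400   # unrecognized 4xx: a client error
assert effective_status(507) == 500   # unrecognized 5xx: a server error
```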

And cookies for state management? Puh-leeze! Cookies are a hack, and only make any kind of sense when you’re dealing with an installed base (browsers) that support them. Remove the browser, and you could do state management properly, on the client (and only on the client) instead of the server. This is Distributed Systems 101 stuff here, sheesh.
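
If that sounds abstract, here’s a toy sketch in Python of the two approaches, using a hypothetical paged-results service of my own invention; notice which side has to keep a table around:

```python
# A toy sketch contrasting server-held session state (the cookie style)
# with client-held state, for a hypothetical paged-results service.

def fetch_page(page):
    # Stand-in for retrieving one page of some result set.
    return f"results, page {page}"

# Cookie-style: the *server* remembers where each client is, so it must
# hold per-client state between requests.
sessions = {}  # session id -> next page to serve

def next_page_with_session(session_id):
    page = sessions.setdefault(session_id, 0)
    sessions[session_id] = page + 1
    return fetch_page(page)

# Client-held state: the client names the page it wants in each request,
# so every request is complete in itself and the server keeps nothing.
def get_page_stateless(page):
    return fetch_page(page)

print(next_page_with_session("abc123"))  # server had to remember "abc123"
print(get_page_stateless(1))             # client remembered for itself
```

The stateless version is the one you can cache, retry, and load-balance without a second thought, which is rather the point.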

Sigh. It disappoints me to think of all the time and money the industry is wasting on this.