Norm responds to a post of mine about why I felt that better technology, and not necessarily new standards, was what was required to solve the problems that XML Catalogs were trying to solve.
He offers three things that he believes can’t be done with caches, but can be done
with XML Catalogs:
Populate the cache. “Caching proxies rely on the fact that you can access
the resource at least once from the web.” wwwoffle does this, but a better caching
system need not. When I talked about the need for operating systems to be in on
caching (and later with the Save-As idea), what I had in mind was treating the
computer’s storage as a structured store (remember
Bento?), such that
any content would hit the disk “named” with its URI. This would permit
the software that Norm installs to include with it a representation of this
resource (schema or whatever), named with its one true URI, and available to any
app on that machine. Again, no new standards required.
In that same section, Norm says that sometimes the URI may never be
directly resolvable. That is definitely a possibility, but again, this same
mechanism of tightly associating the URI-as-name with the data, makes that
mostly moot; it doesn’t matter where the data comes from (modulo trust) if
the data is already bound to that name locally.
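The structured-store idea above can be sketched in a few lines. This is only an illustration of the concept, not a real API: a local store keyed by URI, which an installer can seed directly (say, with a schema) so that later lookups never need the network. The class and method names are my own invention.

```python
import hashlib
from pathlib import Path
from urllib.request import urlopen

class UriStore:
    """Toy sketch of a URI-keyed local store: any content that reaches
    the machine is saved under its one true URI, so every app on the
    machine can find it by name."""

    def __init__(self, root):
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)

    def _path(self, uri):
        # Hash the URI to get a safe filename; the URI itself is the key.
        return self.root / hashlib.sha256(uri.encode()).hexdigest()

    def put(self, uri, data):
        # An installer shipping a resource (schema or whatever) can seed
        # the store directly; no network fetch is ever required for it.
        self._path(uri).write_bytes(data)

    def get(self, uri):
        # Serve from local storage when present. Where the bytes came
        # from doesn't matter (modulo trust), only what they are named.
        p = self._path(uri)
        if p.exists():
            return p.read_bytes()
        data = urlopen(uri).read()  # fall back to the web, at most once
        self.put(uri, data)
        return data
```

The point of the sketch is that `put` and `get` use the same key, so a resource installed by hand and a resource fetched from the web are indistinguishable to consumers.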
Access Development Resources. Yah, what I said above applies here too.
Devise Your Own Resolution Policies. I think your comment about
public identifiers is relevant here; if they were used, this wouldn’t be an
issue, and caching would be useful.
But while I maintain that better technology can do what Norm needs, I’m
not saying that no standardization was necessary. Given that the technology
to do what is needed doesn’t yet exist, and given the extent to which it
would have to be pervasively integrated into operating systems,
standardizing on XML Catalogs may very well have been the best option.
But something tells me that the decision to standardize was made without
knowledge that a technical solution existed. No biggie, just pointing
that out 8-).