Is Dedicated Hosting for Critical DTDs Necessary?
pcause asks: "Recently there was a glitch when someone at Netscape took down a page that had an important DTD (for RSS), used by many applications and services. This got me thinking that many or all of the important DTDs that software and commerce depend on are hosted at various commercial entities. Is this a sane way to build an XML-based Internet infrastructure? Companies come and go all of the time; this means that the storage and availability of those DTDs is in constant jeopardy. It strikes me that we need an infrastructure akin to the root server structure to hold the key DTDs that are used throughout the industry. What organization would be the likely custodian of such data, and what would be the best way to ensure such an infrastructure stays funded?"
Centralization (Score:5, Insightful)
Regards.
w3c (Score:5, Insightful)
DTD? (Score:4, Insightful)
In case of death... (Score:5, Insightful)
Not only DTDs, but also ontology definitions (Score:1, Insightful)
I would suggest someone like OSTG or the Mozilla Foundation...
Sane? (Score:5, Insightful)
No (Score:5, Insightful)
You shouldn't be using DTDs any more. Validation is better achieved with RELAX NG, and you shouldn't use them for entity references, because then non-validating parsers won't be able to handle your documents.
For those document types that already use DTDs, either you ship the DTDs with your application, or you cache them the first time you parse a document of that type.
The Netscape DTD issue was caused not by the DTD being unavailable, but by some client applications not being sufficiently robust. You shouldn't be looking at the hosting to solve the problem.
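The cache-on-first-parse approach the parent describes can be sketched in a few lines. Everything here is illustrative rather than from any particular parser: the cache directory, the function name, and the use of the system temp directory are all assumptions.

```python
import hashlib
import os
import tempfile
import urllib.request

# Hypothetical cache location; a real application would use a proper
# per-user cache directory rather than the system temp directory.
CACHE_DIR = os.path.join(tempfile.gettempdir(), "dtd-cache")

def fetch_dtd(url):
    """Return the DTD text for url, hitting the network at most once."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, hashlib.sha256(url.encode()).hexdigest())
    if os.path.exists(path):                      # cache hit: no network
        with open(path, encoding="utf-8") as f:
            return f.read()
    with urllib.request.urlopen(url) as resp:     # first (and only) fetch
        text = resp.read().decode("utf-8")
    with open(path, "w", encoding="utf-8") as f:  # prime the cache
        f.write(text)
    return text
```

Once the cache is primed, the hosting site can disappear entirely and the application keeps working, which is the robustness the parent is asking for.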
Don't use them (Score:5, Insightful)
Sure, DTD files are necessary for development. If your app requires that they be used to validate something in real time each time it comes in from a client or whatever, then use an internal copy of the version of the DTD file that you support. If the host makes a change to it (or drops it, or lets it get hacked), your app won't break, and you can decide when you will implement and support that change.
I really don't see what is gained by making the real-time operation of your application dependent on the availability and integrity of remotely and independently hosted files. It just makes you fragile, and you can get all the benefits you need from just checking the files during your maintenance and development cycles.
Perhaps something like "pool.ntp.org"? (Score:5, Insightful)
NTP.org [ntp.org] maintains a pool of public NTP servers that are accessible via the hostname "pool.ntp.org", so perhaps something similar would work for a global DTD repository. An industry organization with a vested interest (the W3C seems the most logical) could maintain the DNS zone, and organizations could volunteer server space and bandwidth to host a mirror of the collected pool of DTDs. Volunteering organizations might come and go, but when that happens it's just a matter of updating the DNS zone to reflect the change, and everyone using DTDs just needs to know that a single generic hostname will always provide a copy of the required DTD.
Just a thought...
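A client working against such a pool would only need a trivial failover loop. The mirror hostnames below are placeholders, and the `opener` parameter is there purely so the logic can be exercised without a live pool:

```python
import urllib.request

# Placeholder mirror hostnames; in practice a single pool record such as
# "dtd.pool.example.org" would resolve to whichever mirrors are alive.
MIRRORS = ["dtd1.example.org", "dtd2.example.org"]

def fetch_from_pool(path, mirrors=MIRRORS, opener=urllib.request.urlopen):
    """Try each mirror in turn and return the first successful response body."""
    last_err = None
    for host in mirrors:
        try:
            with opener("http://%s%s" % (host, path)) as resp:
                return resp.read()
        except OSError as err:
            last_err = err  # mirror down; fall through to the next one
    raise last_err
```

With DNS round-robin on the pool name, most clients would never even reach the second iteration.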
URI vs URL (Score:5, Insightful)
"http://my.netscape.com/publish/formats/rss-0.91.
It is silly that millions of RSS readers fetch a non-changing file from the same web site every day. It is only very slightly less silly that they fetch it from the web at all.
Re:No (Score:3, Insightful)
Not sufficiently robust is an understatement. ****ing stupid is what I would call it. If every browser had to hit the W3C site for the HTML DTDs every time they loaded a web page, the web would collapse.
Re:I know! (Score:3, Insightful)
The point was that relying on a single entity isn't a good idea. Google is a single company; The Internet Archive is a single organisation.
I'd suggest something more along the lines of DNS, where although there would be a single ultimate authority, the day-to-day business of serving DTDs would be distributed and handled by multiple levels of servers.
short answer: no (Score:4, Insightful)
If you're reading a doc, don't bother validating it. You're probably going to have to handle "invalid" XML anyway. When you're constructing XML, you should write it according to the DTD, but if you're relying on a remote site, then you're asking for trouble. Just cache the version locally; but seriously, your tool shouldn't really need it. Your engineers do, but not the tool.
Finally, it's trivial to reconstruct a DTD from sample documents.
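As a rough illustration of that last point, here's a sketch that infers element names and their observed children from a sample document and emits DTD-ish declarations. The content models it guesses (everything optional and repeatable, leaves as #PCDATA) are deliberately loose, and the function names are made up:

```python
import xml.etree.ElementTree as ET

def infer_elements(xml_text):
    """Map each element name to the set of child-element names seen under it."""
    seen = {}
    for elem in ET.fromstring(xml_text).iter():
        seen.setdefault(elem.tag, set()).update(child.tag for child in elem)
    return seen

def sketch_dtd(seen):
    """Emit loose <!ELEMENT> declarations from the observed structure."""
    lines = []
    for tag, kids in sorted(seen.items()):
        model = "(" + "|".join(sorted(kids)) + ")*" if kids else "(#PCDATA)"
        lines.append("<!ELEMENT %s %s>" % (tag, model))
    return "\n".join(lines)
```

Feed it enough representative documents and you converge on something close to the original DTD, minus ordering and cardinality constraints.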
EXACTLY (Score:5, Insightful)
A DTD spec SHOULD have both a PUBLIC identifier and a SYSTEM identifier. The system identifier is strongly recommended to be a URL so that a validating parser can fetch the DTD if the DTD is not found in the system catalog.
The system catalog is supposed to map from the PUBLIC identifier to a local file, so that the parser needn't go to the network.
If you are running a recent vintage Linux, look in
So:
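For reference, a catalog entry of the kind described above looks roughly like this. The PUBLIC identifier shown is the well-known RSS 0.91 one, but the local `uri` path is just an assumption about where a distro might install the file:

```xml
<?xml version="1.0"?>
<!-- Sketch of an OASIS XML catalog entry; the uri path is hypothetical -->
<catalog xmlns="urn:oasis:names:tc:entity:xmlns:xml:catalog">
  <public publicId="-//Netscape Communications//DTD RSS 0.91//EN"
          uri="file:///usr/share/xml/rss/rss-0.91.dtd"/>
</catalog>
```

A catalog-aware parser that sees the matching PUBLIC identifier never touches the network at all.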
Re:Sane? (Score:3, Insightful)
Public hosting of schema documents should not be for application use where the application knows ahead of time what kind of document it will be parsing (like the RSS situation). In all likelihood, a change to that schema document will cause an error in the XML parsing anyway, since the parser isn't expecting new or changed elements.
Public hosting of documents should be reserved for editors that create XML documents that must comply with a given format. This allows XML authors to validate their documents against the schema, but nothing breaks when the publicly-hosted document becomes unavailable.
DTD Critical Hosting (Score:2, Insightful)
Some might think, "Well, what if it changes?"
Well, it's obvious: download the new one and update your XHTML/XML or application for the specific changes.
The security implications are extremely ugly (Score:3, Insightful)
2) Remember the incident where popular "safe" Superbowl sites were compromised and laced with malware-installing code? What happens to millions of Firefox-on-Windows users when a bunch of Russian mobsters or Chinese government agents hijack a DTD host and load it with a zero-day Windows exploit?
3) Remember "pharming", where DNS servers are hijacked to redirect *CORRECTLY TYPED URLS* to malware-infested sites? Even if the bad guys can't hijack the DTD host, they can still hijack Windows-based DNS servers (ptui!), and anybody who relies on them gets redirected to a malware-install site.
That's the problem; here's my solution. It's composed of two parts.
A) DTDs will be *LOCAL FILES ON YOUR WORKSTATION* (excepting "thin clients").
B) Browsers (or possibly operating systems) will include new DTDs with updates. In POSIX OSes (*NIX, BSD) DTDs will be stored in
Windows will have its own locations. When you get your regular update for your browser (or alternatively, your OS), part of the update will be any new DTDs. There will be a separate file for each DTD and version, so that your browser can properly handle multiple tabs opening to sites using different versions of the same base DTD.
Re:Catalog files? (Score:5, Insightful)
Is there some technical reason I'm not aware of that means it has to stay somewhere central?
There shouldn't be, yet I would be greatly surprised if some application didn't match on the entire DTD string, hostname and all.
I am equally baffled at what applications need the DTD for anyway. Except for generic XML applications, what use is a DTD? Most applications only handle a fixed few XML document types anyway.
Finally, if they really need that DTD... any distro has most major DTDs available. No reason why they couldn't carry a few extra. Should be easy to just search for them locally.
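To avoid the "match on the entire DTD string" trap mentioned above, an application can pull the identifiers out of the DOCTYPE and key off the stable PUBLIC identifier alone. This regex-based extractor is a quick sketch, not a full DOCTYPE parser (internal subsets and SYSTEM-only doctypes are out of scope):

```python
import re

# Matches <!DOCTYPE name PUBLIC "publicId" "systemId"> declarations only.
DOCTYPE_RE = re.compile(r'<!DOCTYPE\s+\S+\s+PUBLIC\s+"([^"]*)"\s+"([^"]*)"')

def doctype_ids(xml_text):
    """Return (publicId, systemId) from the DOCTYPE, or None if absent."""
    match = DOCTYPE_RE.search(xml_text)
    return match.groups() if match else None

# Key off the PUBLIC identifier: it stays stable even if the hosting URL moves.
RSS_091_PUBLIC = "-//Netscape Communications//DTD RSS 0.91//EN"

def looks_like_rss_091(xml_text):
    ids = doctype_ids(xml_text)
    return ids is not None and ids[0] == RSS_091_PUBLIC
```

An application written this way wouldn't have cared whether my.netscape.com was up, down, or gone entirely.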