The Internet

Is Dedicated Hosting for Critical DTDs Necessary?

pcause asks: "Recently there was a glitch when someone at Netscape took down a page hosting an important DTD (for RSS) that is used by many applications and services. This got me thinking that many or all of the important DTDs that software and commerce depend on are hosted at various commercial entities. Is this a sane way to build an XML-based Internet infrastructure? Companies come and go all the time, which means the storage and availability of those DTDs is in constant jeopardy. It strikes me that we need an infrastructure akin to the root server structure to hold the key DTDs that are used throughout the industry. What organization would be the likely custodian of such data, and what would be the best way to ensure such an infrastructure stays funded?"
This discussion has been archived. No new comments can be posted.

  • Centralization (Score:5, Insightful)

    by ushering05401 ( 1086795 ) on Thursday May 17, 2007 @06:09PM (#19170725) Journal
    Nothing too insightful to write, but worth saying in today's volatile political climate. Centralization makes me nervous.

    Regards.
  • w3c (Score:5, Insightful)

    by partenon ( 749418 ) * on Thursday May 17, 2007 @06:09PM (#19170729) Homepage
    w3c.org [w3c.org]. There's no better place to keep the standards related to the web.
  • DTD? (Score:4, Insightful)

    by mastershake_phd ( 1050150 ) on Thursday May 17, 2007 @06:10PM (#19170755) Homepage
    and DTD stands for? Distributed Technical Dependency?
  • by Kjella ( 173770 ) on Thursday May 17, 2007 @06:11PM (#19170785) Homepage
    ...keep a copy, host it on your own site, and reference that instead. There was no problem, except that some applications were using that file to download the definitions live. Or just expand the definition to include a checksum and a list of mirrors. Is this even a problem worth solving? I mean, except for the Slashdot post, it seemed to me like this went by without anyone noticing.
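    A minimal sketch of that checksum-plus-mirror-list idea, in Python; the mirror URLs and the pinned digest below are placeholders, not real infrastructure:

```python
import hashlib
import urllib.request

# Hypothetical mirror list and pinned checksum for the RSS 0.91 DTD.
MIRRORS = [
    "https://dtd.example.org/rss-0.91.dtd",
    "https://dtd-mirror.example.net/rss-0.91.dtd",
]
EXPECTED_SHA256 = "0" * 64  # placeholder; pin the real digest of the copy you trust

def fetch_dtd():
    """Return the DTD bytes from the first mirror whose copy matches the checksum."""
    for url in MIRRORS:
        try:
            data = urllib.request.urlopen(url, timeout=10).read()
        except OSError:
            continue  # mirror unreachable; try the next one
        if hashlib.sha256(data).hexdigest() == EXPECTED_SHA256:
            return data
    raise RuntimeError("no mirror served a DTD matching the pinned checksum")
```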
  • by Anonymous Coward on Thursday May 17, 2007 @06:18PM (#19170879)
    Such a system should also allow stable storage and management of ontology definitions, used within the semantic web.

    I would suggest someone like OSTG or the Mozilla foundation...
  • Sane? (Score:5, Insightful)

    by DogDude ( 805747 ) on Thursday May 17, 2007 @06:25PM (#19171009)
    Well, I wouldn't call it sane if anybody who is actively using XML and needs a DTD isn't hosting it right along with whatever web site they're using the XML for. Relying on somebody else to maintain a critical DTD that you use isn't sane. It's pretty dumb.
  • No (Score:5, Insightful)

    by Bogtha ( 906264 ) on Thursday May 17, 2007 @06:25PM (#19171023)

    You shouldn't be using DTDs any more. Validation is better achieved with RelaxNG, and you shouldn't use them for entity references because then non-validating parsers won't be able to handle your code.

    For those document types that already use DTDs, either you ship the DTDs with your application, or you cache them the first time you parse a document of that type.

    The Netscape DTD issue was caused not by the DTD being unavailable, but by some client applications not being sufficiently robust. You shouldn't be looking at the hosting to solve the problem.
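    For reference, a rough sketch of the RelaxNG validation the parent suggests, assuming Python with lxml; the schema and document filenames are made up:

```python
from lxml import etree

# Compile a (hypothetical) RELAX NG schema once, then validate documents against it.
relaxng = etree.RelaxNG(etree.parse("feed-schema.rng"))
doc = etree.parse("feed.xml")

if not relaxng.validate(doc):
    for error in relaxng.error_log:
        print(error.message)
```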

  • Don't use them (Score:5, Insightful)

    by Anonymous Coward on Thursday May 17, 2007 @06:35PM (#19171191)
    If the absence of these files will break your app or service, then you need to make your app or service more robust.

    Sure, DTD files are necessary for development. If your app requires that they be used to validate something in real time each time it comes in from a client or whatever, then use an internal copy of the version of the DTD file that you support. If the host makes a change to it (or drops it, or lets it get hacked), your app won't break, and you can decide when you will implement and support that change.

    I really don't see what is gained by making the real-time operation of your application dependent on the availability and integrity of remotely and independently hosted files. It just makes you fragile, and you can get all the benefits you need by checking the files during your maintenance and development cycles.
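    One way to make that concrete, assuming Python with lxml (filenames are placeholders): parse with network access disabled and validate only against the copy of the DTD your app ships.

```python
from lxml import etree

BUNDLED_DTD = "dtds/rss-0.91.dtd"  # the version your application supports, shipped with it

# Refuse network access outright, so a vanished remote DTD can never break parsing.
parser = etree.XMLParser(no_network=True)
doc = etree.parse("incoming-feed.xml", parser)

dtd = etree.DTD(open(BUNDLED_DTD))
if not dtd.validate(doc):
    print(dtd.error_log)  # handle invalid input locally; the remote host is irrelevant
```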

  • by Zocalo ( 252965 ) on Thursday May 17, 2007 @06:39PM (#19171279) Homepage

    NTP.org [ntp.org] maintains a pool of public NTP servers that are accessible via the hostname "pool.ntp.org", so perhaps something similar would work for a global DTD repository. An industry organization with a vested interest (the W3C seems like the most logical) could maintain the DNS zone, and organizations could volunteer some server space and bandwidth to host a mirror of the collected pool of DTDs. Volunteering organizations might come and go, but when that happens it's just a matter of updating the DNS zone to reflect the change, and everyone using DTDs just needs to know that a single generic hostname will always provide a copy of the required DTD.

    Just a thought...
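    A loose illustration of the pool idea in Python; the hostname below is hypothetical, but the mechanism is the same one pool.ntp.org relies on:

```python
import socket

# A single well-known name resolves to whichever volunteer mirrors are
# currently listed in the DNS zone; clients never need to know who they are.
addresses = {info[4][0] for info in socket.getaddrinfo("pool.dtd.example.org", 80)}
print(addresses)
```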

  • URI vs URL (Score:5, Insightful)

    by Sparr0 ( 451780 ) <sparr0@gmail.com> on Thursday May 17, 2007 @06:41PM (#19171315) Homepage Journal
    A key mistake in your assumptions was brought up when the Netscape fiasco was news, and I will bring it up again...

    "http://my.netscape.com/publish/formats/rss-0.91.d td" is a URI. It uniquely identifies a file. It *HAPPENS* to also be the URL for that same file, for now, but that is just a fortunate intentional coincidence. Your software should not rely on or require the file to be located at that URL. /var/dtd/rss-0.91.dtd is a perfectly valid location for the file identified by the URI "[whatever]/rss-0.91.dtd". What we need is for XML-using-software authors to support and embrace local DTD caches, AND package DTDs along with their applications (with the possibility of updating them from the web if neccessary).

    It is silly that millions of RSS readers fetch a non-changing file from the same web site every day. It is only very slightly less silly that they fetch it from the web at all.
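    A sketch of that local-cache idea in Python (the cache directory is hypothetical): the Netscape address is treated purely as an identifier, and the bytes come from a previously cached copy, fetched from the web at most once.

```python
import os
import urllib.request
from urllib.parse import quote

CACHE_DIR = os.path.expanduser("~/.cache/dtd")  # hypothetical cache location

def local_dtd(uri):
    """Map a DTD URI to a local file path, downloading it only on first use."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    cached = os.path.join(CACHE_DIR, quote(uri, safe=""))
    if not os.path.exists(cached):
        urllib.request.urlretrieve(uri, cached)  # one-time fetch; never again
    return cached

print(local_dtd("http://my.netscape.com/publish/formats/rss-0.91.dtd"))
```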
  • Re:No (Score:3, Insightful)

    by Anonymous Coward on Thursday May 17, 2007 @06:45PM (#19171377)
    The Netscape DTD issue was caused not by the DTD being unavailable, but by some client applications not being sufficiently robust.

    Not sufficiently robust is an understatement. ****ing stupid is what I would call it. If every browser had to hit the W3C site for the HTML DTDs every time they loaded a web page, the web would collapse.

  • Re:I know! (Score:3, Insightful)

    by mollymoo ( 202721 ) on Thursday May 17, 2007 @07:44PM (#19172215) Journal

    The point was that relying on a single entity isn't a good idea. Google is a single company; The Internet Archive is a single organisation.

    I'd suggest something more along the lines of DNS, where although there would be a single ultimate authority, the day-to-day business of serving DTDs would be distributed and handled by multiple levels of servers.

  • short answer: no (Score:4, Insightful)

    by coaxial ( 28297 ) on Thursday May 17, 2007 @08:03PM (#19172423) Homepage
    Validation is overrated, especially when it comes to RSS. There are so many competing "compatible" standards that really aren't. feedparser.org [feedparser.org] has a great write-up about the state of RSS. It's pathetic.

    If you're reading a doc, don't bother validating it. You're probably going to have to handle "invalid" XML anyway. When you're constructing XML, you should write it according to the DTD, but if you're relying on a remote site, then you're asking for trouble. Just cache the version locally; but seriously, your tool shouldn't really need it. Your engineers do, but not the tool.

    Finally, it's trivial to reconstruct a DTD from sample documents.
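    A toy illustration of that last point in Python: walk some sample documents and record which children each element actually has. Real inference needs far more care (attributes, ordering, mixed content), so treat this purely as a sketch.

```python
import sys
import xml.etree.ElementTree as ET
from collections import defaultdict

children = defaultdict(set)

# Collect the observed child elements of every tag across the sample documents.
for path in sys.argv[1:]:
    for elem in ET.parse(path).iter():
        children[elem.tag].update(child.tag for child in elem)

# Emit a crude content model: any observed child, any number of times.
for tag, kids in sorted(children.items()):
    model = "(%s)*" % " | ".join(sorted(kids)) if kids else "(#PCDATA)"
    print("<!ELEMENT %s %s>" % (tag, model))
```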
  • EXACTLY (Score:5, Insightful)

    by wowbagger ( 69688 ) on Thursday May 17, 2007 @08:03PM (#19172435) Homepage Journal
    Exactly right, but it is even worse than that:

    A DTD spec SHOULD have both a PUBLIC identifier and a SYSTEM identifier. The system identifier is strongly recommended to be a URL so that a validating parser can fetch the DTD if the DTD is not found in the system catalog.

    The system catalog is supposed to map from the PUBLIC identifier to a local file, so that the parser needn't go to the network.

    If you are running a recent vintage Linux, look in /etc/xml/ - there are all the catalog maps for all the various DTDs in use.

    So:
    1. The application writers SHOULD have added the DTDs to the local system's catalog.
    2. Failing that, the application SHOULD have cached the DTD locally the first time it was fetched, and never fetched it again.
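    For illustration, roughly what such a catalog entry looks like (shown here as a Python string). The public identifier is the one the RSS 0.91 DOCTYPE declares, if I recall it correctly, and the local path is an assumption about where a package might install the file; libxml2-based parsers consult catalogs named in XML_CATALOG_FILES (or the system catalog) before touching the network.

```python
import os

# A minimal XML catalog mapping the RSS 0.91 public identifier to a local file.
CATALOG = """<?xml version="1.0"?>
<catalog xmlns="urn:oasis:names:tc:entity:xmlns:xml:catalog">
  <public publicId="-//Netscape Communications//DTD RSS 0.91//EN"
          uri="file:///usr/share/xml/rss/rss-0.91.dtd"/>
</catalog>
"""

with open("my-catalog.xml", "w") as f:
    f.write(CATALOG)

# Point libxml2-based tools at the catalog (set before the parser initializes catalogs).
os.environ["XML_CATALOG_FILES"] = os.path.abspath("my-catalog.xml")
```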


  • Re:Sane? (Score:2, Insightful)

    by DogDude ( 805747 ) on Thursday May 17, 2007 @08:07PM (#19172475)
    Well, even if you're not, then you should absolutely, positively, and without any doubt, at least in my mind, have a copy of all of your DTDs.
  • Re:Sane? (Score:3, Insightful)

    by curunir ( 98273 ) * on Thursday May 17, 2007 @08:26PM (#19172687) Homepage Journal
    Exactly. If you write an application that requires a DTD (or XSD for that matter) to parse an XML document, include that file as part of the software. The XML processing code should intercept entity references and load them from the local copy. Not only does this make your application more reliable, it also makes it faster.

    Public hosting of schema documents should not be for application use where the application knows ahead of time what kind of document it will be parsing (like the RSS situation). In all likelihood, a change to that schema document will cause an error in the XML parsing anyway, since the parser isn't expecting new or changed elements.

    Public hosting of documents should be reserved for editors that create XML documents that must comply with a given format. This allows XML authors to validate their documents against the schema, but nothing breaks when the publicly-hosted document becomes unavailable.
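    One way to do the interception described above, assuming Python with lxml (the bundled path is hypothetical): a custom resolver answers requests for the well-known DTD URL from the copy shipped with the application, so parsing never leaves the box.

```python
from lxml import etree

class BundledDTDResolver(etree.Resolver):
    # Map well-known DTD URLs to copies shipped inside the application.
    LOCAL_COPIES = {
        "http://my.netscape.com/publish/formats/rss-0.91.dtd": "dtds/rss-0.91.dtd",
    }

    def resolve(self, url, public_id, context):
        local = self.LOCAL_COPIES.get(url)
        if local:
            return self.resolve_filename(local, context)  # serve the bundled copy
        return None  # anything else falls back to default resolution

parser = etree.XMLParser(load_dtd=True, dtd_validation=True)
parser.resolvers.add(BundledDTDResolver())
doc = etree.parse("feed.xml", parser)
```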
  • by liothen ( 866548 ) on Thursday May 17, 2007 @09:22PM (#19173195) Homepage
    Why doesn't the content provider just provide the DTD? Why worry about caching it, or about random errors popping up in it, when the DTD can be stored on the very same server as the website, or stored with the application? Then it doesn't matter if another company screws up or if some malicious hacker decides to attack the DTD; it doesn't affect your product...
      Some might think: well, what if it changes?
    Well, it's obvious: download the new one and update your XHTML/XML or application to reflect the specific changes.

  • Re:I know! (Score:2, Insightful)

    by UltraAyla ( 828879 ) on Thursday May 17, 2007 @11:27PM (#19174307) Homepage
    I think you're right on the money here. DNS-like was my first thought as well. Have a root system where all updates are made, then have organizations which check for updates to a package of multiple critical DTDs on a weekly or monthly basis or something. Then people can have a list of DTD sources in the event that one goes down (though I'm pretty sure XML only supports one DTD in each document - someone correct me if I'm wrong). This would reduce the burden on any one person, allow organizations to manage their DTDs on their own if they like, etc.
  • by knorthern knight ( 513660 ) on Friday May 18, 2007 @12:58AM (#19174947)
    1) There are some sensitive environments (military, etc.) where you simply do *NOT* connect your internal network to "teh interweb". No ifs, ands, ors, buts. The result is a broken browser wherever the DTDs are required.

    2) Remember the incident where popular "safe" Superbowl sites were compromised and laced with malware-installing code? What happens to millions of Firefox-on-Windows users when a bunch of Russian mobsters or Chinese government agents hijack a DTD host and load it with a zero-day Windows exploit?

    3) Remember "pharming", where DNS servers are hijacked to redirect *CORRECTLY TYPED URLS* to malware-infested sites. Even if the bad-guys can't hijack the DTD host, they can still hijack Windows-based DNS servers (ptui!) and anybody who relies on them gets redirected to a malware-install site.

    That's the problem; here's my solution. It's composed of two parts.

    A) DTDs will be *LOCAL FILES ON YOUR WORKSTATION* (excepting "thin clients").

    B) Browsers (or possibly operating systems) will include new DTDs with updates. In POSIX OSes (*NIX, BSD), DTDs will be stored in /etc/dtd/ and users will be able to add their own DTDs in ~/.dtd.
    Windows will have its own locations. When you get your regular update for your browser (or alternatively, your OS), part of the update will be any new DTDs. There will be a separate file for each DTD and version, so that your browser can properly handle multiple tabs open to sites using different versions of the same base DTD.
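    A sketch of that proposed lookup order in Python; the directories are this proposal's, not anything that exists today:

```python
import os

# Per-user DTDs take precedence over the system-wide set shipped by updates.
SEARCH_PATH = [os.path.expanduser("~/.dtd"), "/etc/dtd"]

def find_dtd(filename):
    """Locate a specific DTD file, e.g. 'rss-0.91.dtd', in the proposed directories."""
    for directory in SEARCH_PATH:
        candidate = os.path.join(directory, filename)
        if os.path.isfile(candidate):
            return candidate
    return None  # the application decides its own fallback policy
```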
  • Re:Catalog files? (Score:5, Insightful)

    by EsbenMoseHansen ( 731150 ) on Friday May 18, 2007 @01:57AM (#19175277) Homepage

    Or better yet, why can't you just copy the blasted thing to your own site if you're going to use it?

    Is there some technical reason I'm not aware of that means it has to stay somewhere central?

    There shouldn't be, yet I would be greatly surprised if some application didn't match on the entire DTD string, hostname and all.

    I am equally baffled at what applications need the DTD for anyway. Except for generic XML applications, what use is a DTD? Most applications only handle a fixed few XML document types anyway.

    Finally, if they really need that DTD... any distro has most major DTDs available. No reason why they couldn't carry a few extra. It should be easy to just search for them locally.
