
Great Computer Science Papers?

slevin writes "Recently I listened to a talk by Alan Kay who mentioned that many 'new' software ideas had already been discovered decades earlier by computer scientists - but 'nobody reads these great papers anymore.' Over the years I have had the opportunity to read some really great and thought-provoking academic papers in Computer Science and would like to read more, but there are just too many to sort through. I'm wondering what great or seminal papers others have encountered. Since Google has no answers, perhaps we can come up with a list for the rest of the world?"
  • Nay, archetypal... (Score:5, Interesting)

    by Empiric ( 675968 ) * on Sunday November 16, 2003 @08:41AM (#7486599)
    For "great and seminal" it's hard to beat Alan Turing's 1950 (!) paper on AI [loebner.net].
  • Classic papers (Score:5, Interesting)

    by thvv ( 92519 ) on Sunday November 16, 2003 @08:55AM (#7486628) Homepage
    "The UNIX Time-Sharing System," by Dennis Ritchie & Ken Thompson, is one of the best-written papers ever. The elegance of thought and economy of description set a standard we should all aspire to.
    http://cm.bell-labs.com/cm/cs/who/dmr/cacm.html

    I list several more classics on my "Software Engineering Reading List" page at
    http://www.multicians.org/thvv/swe-readings.html
  • by cperciva ( 102828 ) on Sunday November 16, 2003 @08:58AM (#7486637) Homepage
    If nobody reads those "great old papers" any more, there's probably a reason. Sometimes the ideas have been superseded; sometimes they weren't any good to begin with; often the papers are simply very hard to understand. The fact that people seriously suggest reading "great papers" reflects the immaturity of the field: in a field like mathematics, hardly anyone reads the original papers (even for work done in the 20th century), opting instead to read someone else's simplification/clarification of the ideas.

    We speak of TAoCP as "the bible", but I'm not sure there are any "new" ideas in it; rather, its value is as a compilation and exposition of the best ideas other people have produced.

    Learn about great algorithms; don't worry about reading great papers.
  • by Multics ( 45254 ) on Sunday November 16, 2003 @08:59AM (#7486644) Journal
    I wonder whether, if we made a list of, oh, say n of these, the typical /.er would actually read them.

    I've taught computer science, specifically software engineering, where there is about a 1" thick stack of around 15 papers that capture the whole idea. Wonderful works like Dijkstra's "Go To Statement Considered Harmful" (Communications of the ACM 11(3), pp. 147-148, 1968) come to mind. But I don't think there's much hope that the typical /.er will take the time and effort to read them, let alone think about them.

    In the last couple of weeks, /. as a culture came up in a lunch conversation between my co-workers and me. We came to the conclusion that the wild herd doesn't pay for stuff (Kazaa, Morpheus, etc.), is ADD (how many times have you read a posting where the poster clearly hadn't read the link?), and generally thinks that education is mostly worthless (the biannual do-I-need-a-degree grudge match). Given these behaviors, why go through the effort of making a list?

    If I were working this space (putting my teaching hat back on) I'd cover:

    Computer Architecture (where all things come from)

    Theory of Computing including O() [& friends], analysis of algs, Turing, etc.

    Software Engineering

    Software Testing

    Graphics

    Databases

    Numerical Methods

    Simulation (& Statistics)
    and

    Systems Analysis (where apparently all books currently suck)

    I think that would be the place to start and there would be more than 10 or 20 of them.

    -- Multics

  • by Space cowboy ( 13680 ) on Sunday November 16, 2003 @09:02AM (#7486651) Journal
    Alan Turing was a genius, pure and simple.

    His crypto work during the war was massively significant in winning the Battle of the Atlantic; his ideas on programming, AI, neural networks, and the more public "Turing test" were breathtaking and groundbreaking. Less well known are his work on non-linear dynamics in biology and some exceptional papers in physics. A modern version of the Renaissance scientist; the Michelangelo of his day.

    The hounding he endured (because he was gay), his arrest, the loss of his clearance, and his subsequent suicide by cyanide in '54 amounted to shameful treatment of one of the most brilliant scientific minds of the century.

    Simon.
  • by Anonymous Coward on Sunday November 16, 2003 @09:17AM (#7486684)
    I'd love to see some good solid works on UI in there too.

    UI not just as "how a GUI widget should work", but everything involving human/computer interaction, from really simple basics like "where to sit" up to any kind of abstract concept concerning how a machine and a human can get information to and from each other most usefully.
  • by A. Brate ( 588407 ) on Sunday November 16, 2003 @09:17AM (#7486685) Homepage Journal
    This is shameless self-promotion, but you should read my book [amazon.com]!

    Technomanifestos discusses the truly thought-provoking, inspirational, seminal computer papers of the 20th century [technomanifestos.net], from Turing's "On Computable Numbers" and "Computing Machinery and Intelligence", to Alan Kay's "Personal Dynamic Media" to Larry Wall's States of the Perl Onion.

    The book delves into the historical, biographical, and scientific context of works such as these and follows the thread of inspiration to today's world. If you want to know where the Internet germinated, or how Marshall McLuhan and Pierre Teilhard de Chardin influenced the World Wide Web (or even who McLuhan and Teilhard de Chardin are!), you should pick up my book. And then read it.

    Technomanifestos tracks the evolution of the MIT hacker, from the dapper Boston Brahmin Vannevar Bush to the famously unkempt Richard Stallman, and introduces the cast of lesser-known (to the non-Slashdot world) but crucially inventive individuals such as Ivan Sutherland and Seymour Papert.

    Moreover, it discusses how the truly great computing ideas come from people who recognize that technology, especially information technology, has the power to transform people and society--these are (in the words of similarly great books) tools for thought [amazon.com] and dream machines [amazon.com].

    Or if you have no interest in helping me pay my DSL bill, you can go straight to the sources [technomanifestos.net], many of which are available online.

  • by bj8rn ( 583532 ) on Sunday November 16, 2003 @09:35AM (#7486739)
    There's another way of looking at this. In every sphere of culture (including science) there is a constant alternation between explosive (revolutionary) and stable (evolutionary) development. In the stable phase, the ideas that came before are worked out and everything seems more anonymous; numerous writers and scientists may be known in their own circles but are forgotten quite soon. In the explosive phase, revolutionary ideas are born and the Author, the genius, matters more. So the Author may be dying because there are no new ideas at the moment, but (s)he will rise again one day.

    (ideas borrowed from Thomas Kuhn and Yuri Lotman)

  • Donald E. Knuth (Score:5, Interesting)

    by roffe ( 26714 ) <roffe@extern.uio.no> on Sunday November 16, 2003 @09:44AM (#7486771) Homepage

    Donald Knuth has written a lot of interesting papers, but his paper on TeX's line-breaking algorithm

    • Defines the state of the art in digital typesetting
    • Is a textbook example of how a scientific paper should be written: it outlines the history of the problem, gives historical and current examples, defines the problem statement and discusses the suggested solution.

    and as far as I know, the algorithm is still state of the art and is used only by TeX, InDesign, and an extension to QuarkXPress.
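
    For the flavor of it, here is a much-simplified sketch (in Python, my own toy code, not Knuth's) of the dynamic-programming idea at the core of the paper's algorithm: choose breakpoints that minimize the total squared slack over all lines, rather than greedily filling each line. The real Knuth-Plass algorithm adds stretchable/shrinkable glue, penalties, hyphenation, and fitness classes, none of which appear here.

        # Minimal dynamic-programming line breaker: minimize the sum over all
        # lines (except the last) of (width - line_length)^2. Toy sketch only.
        def break_lines(words, width):
            n = len(words)
            INF = float("inf")
            cost = [INF] * n + [0]      # cost[i]: best badness for words[i:]
            breaks = [n] * (n + 1)      # breaks[i]: where the line from i ends
            for i in range(n - 1, -1, -1):
                length = -1             # -1 cancels the leading space below
                for j in range(i + 1, n + 1):
                    length += len(words[j - 1]) + 1
                    if length > width:
                        break
                    slack = 0 if j == n else (width - length) ** 2
                    if slack + cost[j] < cost[i]:
                        cost[i], breaks[i] = slack + cost[j], j
            lines, i = [], 0
            while i < n:
                lines.append(" ".join(words[i:breaks[i]]))
                i = breaks[i]
            return lines

        text = "the paper defines the state of the art in digital typesetting".split()
        print("\n".join(break_lines(text, 20)))

    A naive greedy wrapper can leave one very loose line at the end of a paragraph; minimizing a global cost is exactly the insight that makes TeX's paragraphs look even.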

  • by dido ( 9125 ) <dido AT imperium DOT ph> on Sunday November 16, 2003 @09:57AM (#7486808)

    "On computable numbers, with an application to the Entscheidungsproblem" [abelard.org]" is unarguably the paper that began the field of computer science as we understand it today. Here we have the first descriptions of universal computing devices, Turing machines, which eventually led to the idea of universal stored-program digital computers. The paper even seems to describe, in what is unarguably the first ever conceptual programming language, a form of continuation passing style [ic.ac.uk] in the form of the "skeleton tables" Turing used to abbreviate his Turing machine designs. It's also relatively easy reading compared to many other scientific papers I've seen.

    Along with this we might also include Alonzo Church's 1941 monograph "The Calculi of Lambda-Conversion" (which sadly does not appear to be anywhere online), where Church systematically laid out the lambda calculus, the basis for all functional programming languages.
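
    To see why the lambda calculus can serve as a basis for computation at all, note that functions alone can encode data. A sketch in Python using Church numerals (the encoding is standard; the helper names are mine):

        zero = lambda f: lambda x: x                     # apply f zero times
        succ = lambda n: lambda f: lambda x: f(n(f)(x))  # one more application
        add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
        mul  = lambda m: lambda n: lambda f: m(n(f))

        def to_int(n):
            # Decode a Church numeral by counting applications of f.
            return n(lambda k: k + 1)(0)

        two, three = succ(succ(zero)), succ(succ(succ(zero)))
        print(to_int(add(two)(three)))  # 5
        print(to_int(mul(two)(three)))  # 6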

  • by crawling_chaos ( 23007 ) on Sunday November 16, 2003 @10:19AM (#7486882) Homepage
    While Turing's contributions to breaking Enigma were valuable, as the years slide on we find that they may have been overstated to cover up other covert operations. Try reading Seizing the Enigma by David Kahn (of The Codebreakers fame). It appears that Enigma was also solved with the help of covert operations that seized monthly settings documents from Nazi weather ships and surrendered U-boats. For most of the war, hints like these were needed to get anything resembling real-time Ultra, even with the bombes cranking away at full speed.

    Interestingly enough, the Gestapo was very careful with its settings documents and its discipline for changing rotors; Bletchley Park never solved the Gestapo version of Enigma.

    None of this should detract from Turing's greatness as a mathematician, but it appears that the British used his reputation to hide a few other facts. No need to alert your enemies to all of your methods, after all.

  • by orthogonal ( 588627 ) on Sunday November 16, 2003 @10:53AM (#7487038) Journal
    You forgot to quote

    We came to the conclusion that the wild herd doesn't pay for stuff

    ...and then proceed to suggest that he offer what is basically a college CS course for free.


    Good point.

    I suppose I could counter with "Doesn't he use any open source software? Think of the course as giving back for the kernel" or something, but that would be disingenuous.

    But I do have an idea about payment.

    Unfortunately, my idea won't put any money in his pocket. (No, it doesn't involve collecting underpants, either.)

    It's more of a pay-it-forward type of idea: train people, and then send them forth to train others.

    It's a good model for accelerating a meme, but perhaps too evangelistic for the Intellectual Property world.

    I'd like to set up some form of co-operative education, where small and easily learned skills (not as complex as what our OP proposes to teach) are taught to small groups, with each learner undertaking to teach another small class to pay his "tuition".

    I think there are some advantages to this model, not the least being that the best way to learn something -- to really learn it and make it part of yourself -- is to teach it.

    There are some problems with it too, but I've got a sketch of some ideas to overcome the obvious ones.

    Of course, it's not a new idea: it's how cults have always spread. The question is, can it propel mundane learning as well as it propels sacred ideas?

    But it's not really a payment scheme. It won't pay the grocery bill.

    Maybe we should sell tickets?
  • by Frodo2002 ( 595920 ) on Sunday November 16, 2003 @11:03AM (#7487076) Homepage

    I guess I have to challenge this one too. Of course ideas are superseded or improved upon; understanding is refined as the field matures. But here is an argument for doing exactly the opposite of what you suggest:

    The historical development of ideas, from their first suggestion to their eventual refinement, represents a natural progression in human understanding and cognition. When you try to short-cut that cognitive development you are invariably left with weak, poorly formed ideas. Great old papers should be read so that you can gain insight into this development of ideas and it may help you understand things much better than before.

    This claim is difficult to back up with any sort of scientific test. As some evidence: one field of education (physics education) specialises in short-cutting the historical development of ideas, and, as we in the field know, the teaching of physics is a spectacular failure (though some would deny it).

    As a personal piece of evidence (it does not count for much, but I don't have any other at hand), I can say I never really felt entirely comfortable with Schrödinger's equation and its probabilistic interpretation until I went back and read Schrödinger's and Born's original papers. That is when I realised that Schrödinger's wave equation describes a wave in configuration space. Also, his subsequent fights with Bohr, where he tried to defend a matter-wave interpretation of the wave function, reveal much about the kind of ontological misclassification humans fall into. Now isn't that amazing? Schrödinger spent a lot of time defending the ontological standpoint that the wave function represented a material wave, even though he was the person who derived the wave equation and should have known better. Is it any wonder, then, that my students, who don't even really understand where the wave function and wave equation come from, think the wave function represents a material wave? I would have had none of this insight without reading the original papers.
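
    To make the configuration-space point concrete (in modern notation, not Schrödinger's original): for N particles the wave function is a single complex-valued field on 3N-dimensional configuration space, not N waves in ordinary 3-space,

        \[
          i\hbar\,\frac{\partial}{\partial t}\,\Psi(\mathbf{r}_1,\dots,\mathbf{r}_N,t)
          = \Bigl[\,-\sum_{k=1}^{N}\frac{\hbar^2}{2m_k}\nabla_k^2
            + V(\mathbf{r}_1,\dots,\mathbf{r}_N)\Bigr]\,
            \Psi(\mathbf{r}_1,\dots,\mathbf{r}_N,t),
        \]

    with Born's rule reading \(|\Psi|^2\,d^{3N}r\) as a probability density on that space. A single function of 3N coordinates is very hard to read as a material wave in three dimensions, which is exactly the misclassification described above.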

  • by CAIMLAS ( 41445 ) on Sunday November 16, 2003 @11:30AM (#7487195)
    You, sir, are ignorant to assume that degree == (education || learning). A degree is a piece of paper from an institution indicating that you are not a complete buffoon and that you are employable. Nothing more. I've questioned the validity of my desire to finish school every semester, and before that, every year of high school. However, do I hold education in low regard? No. Quite the contrary, to the extent that I have been continually lauded by teachers, peers, professors, and others for my knowledge, insight, and whatever else they deem worthy of esteem.

    My gripe with the education system, and particularly higher institutions of learning, which should know better, is that they dumb the material down to the lowest common denominator, can't think of an interesting way to teach it for the life of them, and, for the most part, hardly know the subject themselves (at least in an applicable manner). The most fundamental problem, though, is that most teachers (or professors) are not students themselves: learning, curiosity, and problem solving aren't terribly interesting to them anymore.

    I've run into CS professors who couldn't program. I've run into an (almost-tenured) English professor who wouldn't know good writing if it ripped their face off, let alone proper grammar. I've run into people with master's degrees in communication who have no knowledge of the history of various industries, let alone the methods used in them today. The list goes on, but I need to stop before I get too frustrated.

    Who gets the blame? The professors, surely, for not being adept. But also the institutions that hired them, for hiring such people, and the schools that gave these people their degrees and doctorates in the first place. How about their secondary schooling? That's at fault as well, for not teaching them (at least) how to think critically (deductive logic) and learn on their own. I'd partially blame the plethora of students who go to 4-year schools nowadays: it's turned your average university into a festering wound for these maggots to crawl around in, get drunk, etc., as opposed to an institution of higher learning that has prestigious requirements, schools its students well, and turns out a very high percentage of leaders. The excess of required courses students must take is utter garbage, things that should have been learned in high school (first- and second-year English spring to mind).

    Needless to say, you've stepped on a very sore foot. I don't contest the things you've said about the list of things to go on "the list", as they seem fairly on target to me. It was simply that one statement that set me aflame.
  • by kfg ( 145172 ) on Sunday November 16, 2003 @11:44AM (#7487272)
    "No translation would suffice: Stone felt that only by reading the original text for himself could he arrive at the insight he desired.

    Precisely the point. The exercise is also teaching me a tremendous amount about written language in general and thus English, so the exercise is even currently relevant. My age isn't quite so advanced as Stone's, so I feel a bit free to take the slow road and examine the development of the Greek alphabet from the Phoenician along the way.

    I find this particular bit from the interview you link to rather pertinent to the current topic:

    Isn't that pretty far from home base, from current concerns and difficulties?

    Not really. All our basic problems are there in miniature.


    When we stop reading the great old papers we lose our history. When we lose our history we lose a measure of our understanding as well. You can't properly understand where you are unless you understand how you came to be there.

    KFG
  • You touch on the topic of information overload. There is so much information now, not to mention the amount being generated every day, that nobody has a good grasp of it. Some social scientist once remarked that most "new" works are not really original. A lot of papers, theses, books, articles, magazines, etc. are published every year, yet many of them repeat material and inadvertently cover ground that has already been covered. For instance, the proportion of genuinely novel scientific papers is lower now than 100 years ago, even though the number of papers published is several orders of magnitude larger.

    Ironically, actually spending time mining the old material wouldn't necessarily be better: people would probably spend more time searching for existing work than generating new ideas.

    Perhaps the decline of science will be precipitated by this...

    Sivaram Velauthapillai
  • by handy_vandal ( 606174 ) on Sunday November 16, 2003 @12:06PM (#7487384) Homepage Journal
    All our basic problems are there in [classical Athens] miniature. -- I.F. Stone

    Exactly. The soul of man has not changed since the classical world.

    Good, evil, right and wrong, kindness and cruelty, peace and war -- details may change, interpretations may change, certainly the technologies change ... but in terms of our humanity, we are fundamentally the same as our ancestors.

    There is a terrible temptation -- especially in America, my home country, whose founders saw themselves as the spiritual successors to the democratic principles of classical Athens -- to view one's own country as "better" than the rest of the world. Indeed, there is a terrible temptation to view oneself as "better" than the rest of mankind. But a reading of history says otherwise. We are neither better nor worse than our ancestors: we are surprisingly like them.

    -kgj
  • Costs too much (Score:5, Interesting)

    by Tangurena ( 576827 ) on Sunday November 16, 2003 @02:11PM (#7488098)
    I used to be a member of both societies. Annual student dues (I spent several years working on my master's) to obtain the magazines and journals that interested me came to around $200 per year per society, and a lot more at non-student rates. No company I have worked for in the last 10 years has been willing to underwrite professional society memberships, even though their written policies claim they will.

    A recent short job assignment at HP let me run amok through the online libraries of both the IEEE and the ACM. It was interesting to see published articles from 5-10 years ago that directly covered topics that are the hot issues in the office today. Looking at the hot topics at the last few companies I've worked for over the past 2 years, I saw the same pattern: the scholarly articles run about 5-10 years ahead of the industry.

    While working in medium to large companies, I have found the number of people who do not understand even simple computer science concepts frightening.

    I am curious how much effort is wasted reinventing the wheel. I know a lot of it is, because as a programmer on death-march projects I rarely have the hours to devote to finding out how other people solved the same problem 5-30 years ago. The pointy-haired boss breathing down my back thinks that any time not spent slaving over a hot keyboard is a waste. As the old saying goes: it is hard to remember the job is to drain the swamp when you are up to your armpits wrestling with gators. No amount of showing that spending a few hours sharpening the saw each week would save far more time than it appeared to waste ever changed that. One past job allowed some time to be billed to research each week, until some PHB wandered by to bitch about it. It was the appearance of goofing off reading that made the boss look bad, more than the schedule slipping did. And appearances seem to count for more in today's economy than actual results.

  • by WNight ( 23683 ) on Sunday November 16, 2003 @02:44PM (#7488260) Homepage
    Sometimes what you say is true, that there are insights in the originals that have been lost. Other times they're just old.

    If the original is like this Codd paper you mention, where the author makes a science out of something and other people distill it for popular reading, then yes, reading the original is likely to teach you something.

    But if the original is scientific, as are all of the books that build upon it, you're not likely to learn much more about the state of the art today. You'll learn what it was like then, but nothing will really have been lost, and indeed any of the old mistakes will likely have been corrected.

    Scientific works are expected to make sense at every step. It's not like the game of telephone the Bible went through, where you need to go back to an original to find out what was meant by "unicorns". Further, science usually gets expanded at each step, whereas literature gets translated, inevitably losing something and gaining nothing except new notes.

    So for literature, you need to go back to the originals at every step, and for scientific works, assuming their assumptions proved to be true, you can usually build on the previous generation.

    Not that going back to basics is bad; it provides a reality check. It's just not as necessary.
  • Primes is in P (Score:2, Interesting)

    by CodeBuster ( 516420 ) on Monday November 17, 2003 @12:31AM (#7491179)
    Prof. Manindra Agrawal and two of his students, Nitin Saxena and Neeraj Kayal (both B.Tech. from CSE/IITK, who have just joined as Ph.D. students), have discovered a polynomial-time deterministic algorithm to test whether an input number is prime. Lots of people over (literally!) centuries have been looking for a polynomial-time primality test, and this result is a major breakthrough, likened by some to the polynomial-time solution to linear programming announced in the 70s.

    You may want to add this one to your list....

    Primes is in P [iitk.ac.in]
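
    For flavor, the classical identity the test is built on (in my own naive Python sketch, exponential-time as written; the paper's contribution is checking the congruence modulo X^r - 1 for a suitably small r, which is what makes it polynomial): n > 1 is prime exactly when (X + 1)^n = X^n + 1 (mod n), i.e. when every middle binomial coefficient C(n, k) vanishes mod n.

        # Naive primality check via the identity underlying AKS.
        def is_prime_naive(n):
            if n < 2:
                return False
            c = 1                                # C(n, 0)
            for k in range(1, n // 2 + 1):       # C(n,k) = C(n,n-k): half suffice
                c = c * (n - k + 1) // k         # exact binomial coefficient C(n, k)
                if c % n != 0:
                    return False                 # a middle coefficient survives mod n
            return True

        print([p for p in range(2, 30) if is_prime_naive(p)])
        # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]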
  • by rpg25 ( 470383 ) on Tuesday November 18, 2003 @04:35PM (#7505007)

    I quit the ACM because the only benefit it offered me was Communications of the ACM. I'm sorry to say it, but the CACM is mostly terrible (as opposed to IEEE Spectrum, which is mostly OK). The CACM has a bad identity crisis, torn between serving academics and serving practitioners; IMNSHO it picks a middle ground that makes it of interest to no one.

    What finally pushed me over the edge into resigning my membership was a horrible article about the Yin and Yang of Computer Science. It was so bad that I had to check to make sure it wasn't an April Fools' joke. The last thing I need is my professional association publishing Newage (to rhyme with "sewage") twaddle. I mean, what's next, analyzing software by its Zodiac sign?
