Education

History and Culture of Computing?

living phoenix wrote: "I'm currently working on the feasibility of teaching a class in the history and culture of computing for a collegiate senior thesis project. Basically, I would spend a little over a year studying the subject matter to create enough class materials for three hours of class sessions per week for 16 weeks. I would like to cover, in very brief terms, the "invention" of zero and the positional number system as it relates to computers, mechanical computers including Babbage's Analytical Engine, analog computing, ENIAC, EDVAC, the Mark Series and the first "bug", and the PDP series up through the moderns, with shorter stops at the creation of the internet and systems design. This is a massive undertaking for me, in part because I have so much research to do simply to select the points that are best suited for a 16 week course. Has anyone ever taken a course such as this, heard of such a course, or known anyone who has taught one? Also, I'm making a request for research materials: if you have a text that you thought was intriguing and/or would fit well with the course objectives, please let me know so I can use it in my research." Well, just searching Slashdot I see a CNET article and a book review that will help you out. And don't forget the history of Unix. My guess is that there will be too much material - the hard part will be selecting what is important to keep in and what can be left out.
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    And don't forget to cover the history of foobar. This is essential to your course.
  • by Anonymous Coward
    If you're going to spend a year researching this, you should consider putting it into text form or e-text form so you can potentially sell it.

    I would guess many others are interested in this subject (I am, as a grad student and a teacher) and would be interested in taking this course and/or buying the book.

    Good luck, and great idea!
  • by Anonymous Coward
    Must...Follow...Collective.
    Must...Not...Have...Own...Ideas.
    Must...Get...Culture...Prepackaged...From...Book
  • Wow, that pretty much describes the class that I'm taking now at the University of Minnesota. How would you differentiate your class from an ordinary history of computing class? I guess you mentioned the `culture' of computing, which might make it different from what I'm taking. Also, my class is dealing a lot with how the US government (and occasionally others) funded quite a few important programs.

    Anyway, I'm taking a class with Arthur Norberg [umn.edu], and we're using 4 books:

    Where Wizards Stay Up Late: The Origins of the Internet by Katie Hafner and Matthew Lyon

    Computer: A History of the Information Machine by Martin Campbell-Kelly and William Aspray

    The Soul of a New Machine by Tracy Kidder

    Transforming Computer Technology: Information Processing for the Pentagon, 1962-1986 by Arthur L. Norberg and Judy E. O'Neill
    --
  • by Wastl ( 809 )
    You might want to have a look at a museum (really, no joke).

    I don't know whether you have the time and money for the trip, but a very good recommendation (worldwide) is the German museum of science and technology in Munich (the Deutsches Museum [deutsches-museum.de]). They have very extensive coverage of computing, beginning with simple mechanical calculators like those of Leibniz and Pascal, through the (working!) Zuse Z3 (a 1941 computer), up to the modern developments of today's microprocessors (but not operating systems or software).

    It is very interesting to take a guided tour and have it all explained to you. There is also a book available at the museum store, which can possibly be ordered online and in English (Publications (German) [deutsches-museum.de]).

    There is probably also a museum of the history of science somewhere in the U.S., but if you plan a trip to Europe anyway, the Deutsches Museum is definitely worth your time.

    Sebastian

  • Don't forget Accidental Empires/Triumph of the Nerds, for the more modern bits. ;)

    les
  • w00ly mammoth wrote:

    Who invented the zero?

    There are a couple of cultures that dispute the invention of zero: the Mayans and the Indians. The Mayans came up with the concept of zero sometime around two thousand five hundred years ago, and used it in their calculations and other symbolic representations. For example, the Mayan origin of time is 31 August 3114 BC. This was expressed in Mayan as 13-0-0-0-0, and could be read as 13 cycles of 400 years ago. This cycle ends on 27 December 2012, when the Mayan calendar will roll back to 1-0-0-0-0.

    Unlike our numeric system, the Mayan system was base 20 (does that mean they began counting with their fingers and toes? Those sandals sure are useful in tropical climates...). Zero is used to express "the beginning before the count".
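
    To make the placeholder idea concrete, here is a minimal Python sketch (the helper name is made up for illustration, and it treats the long count as pure base 20, ignoring the 18-unit place the real Maya calendar uses):

    # Illustrative helper only; digits are given most-significant first.
    def positional_value(digits, base):
        value = 0
        for d in digits:
            value = value * base + d
        return value

    print(positional_value([1, 0, 5], 10))         # 105 -- the zero holds the empty tens place
    print(positional_value([13, 0, 0, 0, 0], 20))  # 2080000 -- "13 cycles", nothing in the lower places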

    It is argued that the number zero was also invented in India and adopted by the Arabs around 1000 AD. Archeological findings (AFAIK -- AINAA), however, place the invention of zero in the hands of the Mayans at least 500 years before.

    Please feel free to correct my ignorance.

    Cheers!

    E
  • For a development so recent, there is considerable controversy over some basic details. For instance, consider the GUI. Popular folklore attributes it to Xerox PARC, followed by Apple.

    It is not even clear who invented the Macintosh; perhaps it was Jef Raskin [jefraskin.com], and perhaps some others (even Jobs) had a good idea or two during its development too.

    It's amazing that with all the people involved in these inventions still alive today, nobody quite agrees on who invented what.

    Jef Raskin is certainly not amused. While I can't judge whether he was the true father of the Mac, his point that the truth was far less sexy than the common myth (he being a former professor, in contrast to some dropped-out students) seems reasonable to me. Raskin had the necessary background, and it seems more probable to me that the Mac was no accident but the result of a longer line of thought.

    It's another matter trying to figure out who invented the first computer.

    Sure, Konrad Zuse [epemag.com] together with his Plankalkül language.

    By the way, the teacher should mention that one of the first higher-level languages, COBOL, was invented by a woman, Admiral Grace Hopper.

  • Last summer, I was looking into designing an interdisciplinary major relating to computers and social science. I found my school has a 500-level course in the History of Computing taught by a Dr. Thomas Bergin. For some more information:

    http://www.csis.american.edu/ [american.edu] - Department web site

    http://www.clark.american.edu/~tbergin/ [american.edu] - Professor's web site

    http://www.csis.american.edu/museum/sloan/html/default.htm [american.edu] - A dated web site about a history of computing project sponsored by the Sloan Foundation

    Hope that helps some!
  • If you want a good summary of the culture behind the invention of the internet, I would recommend John Naughton's book "A Brief History of the Future".
  • A good background read is Andrew Hodges' excellent biography of Turing - it covers both the life of this founding father of computing and the early British crypto effort (which used the first electronic computers, more or less).
  • Everybody here has talked about subject matter, but nobody has mentioned much about actually *how* you present the material. I don't know how much history background you have, or who is supervising you, but you will need input both from CS/EE/IS people (to check your technical accuracy) and from history, preferably History and Philosophy of Science, to assist you with presenting things in an informative and thought-provoking manner.

    While how you'd present things is your call, if I were teaching the course I'd very much try to mention the social context of computing history - for instance, the influence the space program had on the development of the integrated circuit.

  • I'd reorder things a bit. While the first major use of computers was number crunching, it was number crunching for cryptography. WWII code breaking was why the U.K. developed the first digital computers (and what Turing was doing). It was also a major cause of the U.S. work, though I wonder how much was also done for the Manhattan Project - does anyone know?
  • Two thoughts:

    First off, don't neglect the British --- Williams tubes (early CRT memories), index registers, and demand paging from Manchester alone; the first well documented subroutine library, first commercial use of a computer in business (Lyons LEO), and a very influential textbook from the EDSAC group at Cambridge. (It's interesting to note that the word "page" was already used at Manchester for a unit of physical memory block-transferred to backing storage --- magnetic drum --- in Alan Turing's manual [mit.edu] for the commercialized Manchester Mk. I).

    Second, emulators are available for a lot of historical systems --- Bob Supnik has his SIMH suite [tiac.net] of emulators for most of the PDP computers, and a few other early minis from IBM, DG, and so forth. Historical Unix (v5, v6, v7) is generally available and does boot on the PDP-11 emulator. He's still working on the PDP-10, for which see also Tim Stark's ts10 [trailing-edge.com], also in alpha, but already booting TOPS-10; TOPS-20 and ITS are on the todo list. (The annoying thing is that working PDP-10 emulators do exist, but are not available to the public).

    There's a limit to the verisimilitude here --- virtual tape never kinks up, the virtual card readers never jam, and the emulators often run an order of magnitude faster than the real machines on modern hardware. But they can still help give students a feel for the environments that people had to deal with thirty and forty years ago.

    • Startup [powells.com], by Jerry Kaplan. A 1994 tale about pen-computing venture Go, and how it went through $75MM in capital before burning up -- as told by the CEO (!).

    • aol.com [powells.com], by Kara Swisher. If you don't want to assign the whole book, the second chapter (about the company's founding by an alcoholic, eccentric, self-destructive entrepreneur) is worth the price of admission alone.

    • Good stuff, but not quite as compelling:
      • Netscape Time [powells.com], by Jim Clark (with Owen Edwards). A fairly human and not-too-smug account of the company's glory years. ((c) 1999, when the stock was still worth something.)
      • The Second Coming of Steve Jobs [powells.com], by Alan Deutschman. Written with flair and wit. Really about one specific person, though -- doesn't dwell on bigger insights on the industry.
      • High Noon: The Inside Story of Scott McNealy and the Rise of Sun Microsystems [powells.com], by Karen Southwick. Too adoring. By pure coincidence, I did P.R. work for Sun (after this book was written), and knew some of the folks responsible for making it so... nice.
    • Two books by Peter Salus: A Quarter Century of UNIX [powells.com], and Casting the Net [barnesandnoble.com] (foreword by Vint Cerf!). Excellent for highly technical audiences, next-to-useless for general audiences. Sorry, Peter. (Random fact: I was friends briefly with his daughter when we were in high school. I believe she still works at Linux Magazine.)
    Finally, I would require students to go to the Vintage Computer Festival [vintage.org] if you're in the Bay Area. There are other useful resources there (such as the mailing list). There's also the Computer History Museum Center [computerhistory.org] at Moffett Field, near Mountain View.

    Good luck with the class! --Tom Geller

  • Check your school's library for the IEEE Annals of the History of Computing, ISSN 1058-6180.
  • There's material for several courses. I'd include the following:
    (1) History of the technology, all the way back to Babbage, picking up after WWII with the first programmable mainframes, minis, PCs, PDAs, and the Net. Then there is the parallel history of software: concepts of programming, assemblers, compilers, clients, distributed systems, databases, games, browsers, and so on. Proprietary until the 1970s, then a mass software market and immense wealth afterwards.
    (2) Engineer/programmer/hacker culture. Books by Levy, Cringely, and Katz. "The Soul of a New Machine" by Kidder.
    (3) Business cycles - moguls and busts. Rise, fall, and resurrection of IBM. Apple. Microsoft. Silly Valley. Rise and fall and rise and fall of gaming. Famous speculative bubbles like minicomputers, expert systems, and the Internet. Books by Stross.
    (4) Literature and movies, including sci-fi. 2001: A Space Odyssey, Asimov's computer and robot stories, Neuromancer, Cryptonomicon(?). Lots of stuff. Speculation versus reality.

  • 1) IBM's Role in the Holocaust (Don't know a lot of detail, but have heard that they were hired by Germany to help them track down the Jews in the pre-war years)

    Intrigued by this (which I had never heard of), I found a link to more information [aol.com].

    --
    Evan

  • by Azog ( 20907 ) on Sunday March 25, 2001 @07:25AM (#341605) Homepage
    If you want to plan such a course, here are some interesting, readable books that would be useful:

    For the history of Mathematics, invention of zero, etc. read: "Mathematics: Queen & Servant of Science" by Eric Temple Bell.

    Before the internet and the personal computer, two of the major uses and research topics for computers were Cryptography, and Artificial Intelligence. Of course plain old number crunching has always been important, but I don't know of any books on that.

    But for the early history and development of cryptography, check out: "The Code Book: the evolution of secrecy from Mary, Queen of Scots to quantum cryptography" by Simon Singh

    For a read on the early history of AI from a naysayer's perspective, check out "What Computers Still Can't Do: A Critique of Artificial Reason" by Hubert L. Dreyfus.

    Hmmm. Possible course outline schedule:

    1. History of mathematics.
    2. Pre-history of computers: Abacus, mechanical machines, Pascal, Von Neumann, etc.
    3. First major use of computers: number crunching. The development of mathematical algorithms.
    4. The Artificial Intelligence hype of the 60's and 70's.
    5. Cryptography and computers.
    6. Theoretical results: Turing, the incomputability theorems, equivalence of artificial languages, "All computer languages are the same", define and describe the P=NP question. "All the interesting questions are too hard".
    7. The rise of the personal computer. I think "Fire in the Valley" is supposed to be good for this, but I haven't read it.
    8. The rise of the internet...

    Maybe about a week on each of those... you would have to move pretty quick and just hit the high points, but it would be a pretty good "tour" of computer science.

    Speaking of tours, another really interesting book is: "The Turing Omnibus: 61 Excursions in Computer Science" by A. K. Dewdney. It has standalone, easy to read chapters on topics like Algorithms, Finite Automata, Simulation, Godel's theorem, The Chomsky Hierarchy, Random Numbers, Error correcting codes, Boolean Logic, Time and Space Complexity, Recursion, Neural Nets, The Fast Fourier Transform, Public Key Cryptography, Number Systems for Computing, Parallel Computing, Logic Programming, Church's Thesis, Relational Databases...

    Heck, you could teach the course entirely out of "Mathematics, queen and servant of science" followed by the Turing Omnibus. That would cover everything important...


    Torrey Hoffman (Azog)
  • I've given three hours worth of lectures on the history of computing. That's an entirely different scale than an entire course, but I have some idea.

    Swing by your library. There will be more books than you care to read on the history of computing. My school's relatively small library has 31 books [lib.sfu.ca].

    Some ideas that you'll easily find information for in any history of computers book:

    • "ancient" history: the abacus, slide rule, mechanical adder, etc. I've had success photocopying the two parts of a slide rule onto two seperate overhead transparencies. They can be put on the glass and moved around to illustrate their use.
    • middle history: Babbage, Jacquard, Hollerith and contemporeries.
    • five generations: computer historians have divided up the 20th century into five "generations" of computing. You might organize the course around these.
    • history of computational theory: when who discovered what algorithm, etc.
    Some other suggestions:
    • History of the Internet: nobody thought that was interesting until recently, so it's not in most books.
    • History of Games: this will keep them awake. There are a lot of web sites and probably some books.
    • Computers in the media: Changing movie portrayals of computers. Find a communications prof or grad who can give you pointers.
    • Politics and law surrounding computers.
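
    About that slide-rule transparency demo: it boils down to one piece of arithmetic. The scales are marked logarithmically, so sliding one against the other adds the logs, which multiplies the numbers. A minimal Python sketch of the principle (the function name is my own, for illustration only):

    import math

    # Illustrative only: add the two "lengths" (logs), then read the answer back off the scale.
    def slide_rule_multiply(a, b):
        return 10 ** (math.log10(a) + math.log10(b))

    print(slide_rule_multiply(2, 3))      # ~6.0, up to floating-point error
    print(slide_rule_multiply(3.5, 4.2))  # ~14.7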

    There are several courses out there [google.com].

    Finally, talk to your undergrad secretary (or equivalent) and have her put you in contact with textbook publishers. A few quick e-mails that say "I'm looking for a text for a course about..." will land you more books than you know what to do with.

    Greg

  • In covering the culture of computers, the biggest problem you'll face is that there are so many different cultures...
  • Much of the setting for the history of software development came from the languages and paradigms people were using. "History of Programming Languages", ISBN 0-201-89502-1, is the reference. Covers C, C++, Smalltalk, Ada, Prolog, Lisp, Algol and many others...
  • If you're talking about the story of Grace Hopper finding a moth in a mainframe, be sure that you realize that this wasn't really the first usage of "bug" in a computer-error sense. Check the Jargon File entry on 'bug' for an explanation.

    Other materials you really ought to check out for this class include:

    • Anything by Steven Levy. Hackers for geek culture history (though see the note about this in the end of the Jargon File), Insanely Great for history of Macintosh, Crypto for history of the cryptography movement.
    • Cyberpunk fiction. Gibson is good, Stephenson is better (Gibson's stuff is really not written from the computer geek perspective, Stephenson's stuff is). Vinge is also quite worthwhile, from what I've read of his.
    • Clifford Stoll's The Cuckoo's Egg. He's a psycho luddite, but it doesn't show through in an annoying way in this book. :)
    • Tracy Kidder's The Soul of a New Machine
    • Where Wizards Stay Up Late. Can't remember the names of the co-authors of this book and I don't have my copy handy, but it's a good book about the history of the Internet.
    • Open Sources, an O'Reilly book with multiple essays about the Open Source movement by people who are actually in the Open Source movement (IE, Linus and RMS, and it's edited by ESR).
    • Godel, Escher, Bach: An Eternal Golden Braid, by Douglas Hofstadter. Really doesn't apply to this, but it's a damn good book and I want to put the full force of the Allan Crain Seal of Approval behind it. ;)
    PS: Any chance you could conduct this class at the University of Missouri-Rolla? I'd like to take it.
  • IAAI (I am an indologist.)

    The earliest clear reference to base ten notation using zero (which was a dot) is found in the Yogabhaasya (a commentary on the Yogasutra), which was composed perhaps around the 5th century. The usage suggests that the notation was already in wide use.

    Some scholars think (or used to think) that Naagaarjuna's concept of emptiness (I guess many of you have heard of this: everything is empty, etc.) had something to do with the notion of zero.

    Paa.nini's grammar (around 5th century B.C.?) uses an idea of zero substitution. A grammatical element is substituted with nothing (zero). Some think this is the invention of zero, in the sense that ``nothing does some function.''

    Note that Aryabha.ta was not from the 7th century B.C. (as another poster suggested), but from the 7th century A.D.

    By the way, Paa.nini's grammar sounds pretty much like an OO computer language. Declarations, short statements, classes, inheritance, functions, etc. Some linguists (I believe) have tried to implement Paa.nini's grammar as a program.

  • I have a Jargon File mirror [dyndns.org] if you can't get to the main site.

    --

  • Open Sources, an O'Reilly book with multiple essays about the Open Source movement by people who are actually in the Open Source movement (IE, Linus and RMS, and it's edited by ESR).

    In RMS' absence, I must point out that RMS is not a part of the open source movement, but rather the Free Software Movement, which has been around longer.

    --

  • by hph ( 32331 )
    This is tangential, and probably offtopic too, and who reads comment #200-and-something anyway?
    Just a thought that hit me: what about the philosophical views of CS? I think there are two big camps in CS (boy, are we over-generalising today):
    Those who think that CS is math (von Neumann and 99% of all CS profs :)
    Those who think that CS is language (Ted Nelson)
    Back to drinking more beer.. X-)
  • I have read these books, which gave me a lot of insight into how computing was commercialized (IBM/Apple/Microsoft). Very light reading too. You may not want them directly on your curriculum, but suggest them as extra reading.
    • Accidental Empires [amazon.com] - by Robert X. Cringely. Funny & interesting read. Talks a lot about how IBM lost & Microsoft rose.
    • Show Stopper [amazon.com] - talks about the story behind creating NT.
    Also show them the video
    • Pirates of Silicon Valley
    • code control - story of netscape/mozilla
  • Does nobody here remember WGBH's 1996 PBS miniseries, The Machine that Changed the World [vt.edu]?
  • There is just so much information, facts, articles, anecdotes, etc., to cover. I don't want to be pessimistic, but the fact is I have read a TON of "History of Computers" books and they ALL come up short in some respect. Some leave out important details, some contradict others.

    When I say I have read histories, I mean it. I have a couple of "History of Computers" books - wait, let me grab one - here it is:

    "COMPUTERS: The Machines We Think With" by D.S. Halacy, Jr. (Dell Books, 1962)

    A very yellowed paperback, I might add. Pretty good book giving a good overview of the history and current progress (for the time) of computers. I have a ton more. Typically, what I have found to be the best "histories" of computers are books that are current for the time, showing the tech where it is at "right then".

    The field is just so large - I daresay you could spend your life just gathering the information for the history, never mind trying to organize it for printing (if it could fit in book form!). You could spend an entire book on the history of the punch card (and why the text monitor standard was 80 columns wide - they are related).

    One frustrating thing you would find would be the number of dead ends - and the lost data on various machines and systems. One area I have always wanted to find out more about is the hobbyist scene of the late 60s to early 70s, just prior to the invention of the 4004. I am sure it existed, with people making their own machines from telephone relays and other equipment in their garages or basements - but there is so little to go on about such things. (I have an article about how to build a telephone dialer system using simple logic circuits and a drum program system to dial a phone - a dial phone, mind you - for when a burglar breaks into your house. Simple systems like that were being done, and possibly more complex ones existed as well.) It is hard to even find stuff from the 70s - I found one book, digging through a used bookstore, on how to build "The TV Typewriter" - most histories don't even mention it, but it is a big part of personal computing!

    Good luck - you will need it. The best you will be able to do is skim. Perhaps let your students know this, teach them how to find out more about the topics themselves. There is so much out there...

    Worldcom [worldcom.com] - Generation Duh!
  • From the course syllabus [ucalgary.ca]...

    Objectives

    The Department's objective in offering this course is to provide an introduction to the history of computation - including early computing devices and their uses, and the people who developed them.

    Ideally, students who successfully complete this course will improve their understanding of how the field of computing developed and matured. They will be expected to be aware of the principal people, places, and events that shaped their profession. Such students will appreciate the broad sweep of this branch of history and be able to relate it to the social and scientific changes that were taking place during the same time frames. They will also be able to describe the concepts and show some understanding of the developments and be able to differentiate and make comparisons between them.

    Additional information about a fall 1998 section of this course - namely, a collection of additional readings used to supplement the course text - is also available.


    (Please note - I have "highlighted" those parts of the text which I thought gave insight toward the scope of this particular class)

    I am not saying this class isn't a good class; however, judging from the syllabus alone, it seems to do just what I said could only realistically be done - namely, "skim".

    It is an introductory course, not designed to give an in-depth view of the history. It seeks only to point out "principal people, places, and events", which, while those individuals are important, probably leaves out a lot of minor players who made contributions to the history of computing that weren't recognized until much later, if ever. (People like Jacquard, with a card-controlled loom, directly influenced Babbage, and further, Herman Hollerith, who later helped found IBM, which went on to make the standard 80 column punch card, which led to 80 column video displays. I am certain I am leaving out steps here, but the point is this is one known example - there are many lesser-known ones, and students of the course will never know about them.)

    I feel that this course seeks to point students in a particular direction. Perhaps some will go further with the knowledge gained from it, but most will simply take what was said in the course as "that is all there is", and not find out more about this particular area of study.

    The syllabus admits to the history of computers having a "broad sweep", something that stands out in the course of all history. I dare to think, without having taken the course, that it probably starts with Pascal's investigations and inventions (or perhaps Napier's bones for calculations), and stops at the ENIAC era, with anything after that machine being relegated to "modern" times. But the fact is a lot of investigation into logic and calculation was made long before Napier or Pascal (indeed, look at the Antikythera Mechanism [giant.net.au] for such an example), and a lot of history has been made since the ENIAC.

    Alas, I fear a lot of students will never really know about it, or care.

    Worldcom [worldcom.com] - Generation Duh!
  • <irony>There's no need to mention any nasty foreigners, they don't count. There's no need to mention people like Zuse, Turing, and Flowers, or machines like Colossus, Baby (better known as the Manchester Mk 1), or LEO. After all, they probably didn't really exist. They're just all myths made up by jealous, ignorant foreigners trying to pretend that America is not the sole, universal creator of all that is good or useful in the world. All hail the glorious United States of America.</irony>
  • It should go without saying that the Jargon File should be required reading. Not only is it informative, but it is also extremely funny.

    I second that motion. Follow that link now.

    --
    SecretAsianMan (54.5% Slashdot pure)
  • Don't just cover the PDPs. They were great machines, but there was a lot more stuff going on in the 60s than them. Big, BIG machines by IBM and the Seven Dwarves (read up about this).

    Also, it might be fun to find a collector who has some classic systems and have the class meet at his house once. Just get his permission first.

    --
    SecretAsianMan (54.5% Slashdot pure)
  • I would check out two course web sites. The first, taught by David Mindell at MIT, looks quite good: http://web.mit.edu/STS.035/www/

    Also, I taught such a course in 1999 -- http://instruct1.cit.cornell.edu/courses/sts355/

    and plan on revising it for the fall 2001 semester. I agree with the earlier post that you will have an abundance of materials. The Ceruzzi volume is a good basic textbook that you might use. Good luck.
  • Please, by all means post again when your lecture material is ready. I would be veeery interested in reading through it. :)
  • It's really easy. Just go interview Katz, 'cause he knows everything about the History and Culture of Computing. ;-)

    --

  • In Fall 1997, I took a class on the History of Computers offered by a then-visiting professor from Stanford. Since then, Professor Paul Edwards [umich.edu] has come to the University of Michigan and brought this fascinating class with him. His web site offers class information from both the older Stanford class [umich.edu] as well as the newer University of Michigan class [umich.edu]. A brief description can be found in the UofM course guide [umich.edu].

    The class was quite a bit of work, but was very rewarding. The final research project was very cool, as Prof. Edwards was very flexible about methods of submission (paper, video, web site, etc.) and topics ranged from women in computing to digital music to a brief history of Soviet computing (and yes, you can write a 3500-4500 word paper about the history of Soviet computing).

    We learned about more than just the history of computers, however. This class forced me to think about how technology affects society, long before such musings became as popular as they are today. We learned about the role that census machines played in the Holocaust, about how a military boondoggle [mitre.org] supplied some of the key components to today's computing technology, how women who played such a key role in the early years of computing were pushed aside, and finally the role of technology in other countries.

    Of course, this class holds a special place in my heart since I met my fiancé [umich.edu] while researching my paper, so I might be a little biased :)

  • If you're looking for good background material, or for a book that's worthwhile although perhaps not primary, I thought John von Neumann and Norbert Wiener: From Mathematics to the Technologies of Life and Death (Steve J. Heims, MIT Press) was an excellent jumping-off point when my prof assigned it for our Science & The Social Order seminar frosh year...

  • Don't forget to mention Konrad Zuse and the Z-1 computer!
  • For a history of clocks, try The Discoverers [amazon.com] by Daniel Boorstin.
  • the Mark Series and the first "bug"

    I take it by this you are referring to the moth found jammed in a relay of a Harvard Mark II machine in 1947. Contrary to what many believe, this is not the first usage of the word "bug" to denote a problem with a machine. The term was around long before the first computer was ever built -- it's recorded as far back as 1896, and was probably in oral use long before then. The fact that "bug" was already common in 1947 is evidenced by the Mark II's conspicuously-worded logbook entry: "First actual case of bug being found" (emphasis mine).


    Regards,

  • by binner ( 68996 )
    I'm about to enter my final year of University next fall, and if my Uni offered something like that, I'd really enjoy it. I know that doesn't answer your question, but I hope it encourages you even if you don't get the answers you're looking for.

    I'm completely fascinated with computers today, but love reading the histories as well...I can't even tell you how many times I've reading about 'The Dawn of UNIX'!

    Good luck, I hope this turns out for you!

    -Ben
  • s/reading/read/

    And that's after a preview! Way too early!

    -Ben
  • While it's not exactly a typical textbook, per se, CODE: The Hidden Language of Computer Hardware and Software [amazon.com] by Charles Petzold could be a handy reference. It covers the history of computing relatively thoroughly, from a hardware-conceptual level. It isn't a book of merely names and dates to memorize; instead it seeks to actually explain the insights and advances that allowed computing to take place. As the book progresses it becomes much more technical, and perhaps not as relevant, but I'd imagine that most of the book could be useful for a course like this, if nothing else to give you some background/research material.

  • Mike Williams, now a Professor Emeritus, began such a course at the University of Calgary in the early 80's.

    I believe he also spent a year as the Guest Curator of the History of Computing wing of the Smithsonian (and wrote the last program ever plugged into ENIAC, which now rests there).

    It's still running at the U. of Calgary as CompSci 509.

    Prof. Williams has a History of Computing Web Site [ucalgary.ca]. ...and just clip the last directory level off that to get his own web page.

    Best of Luck.

  • Check out the Slashdot review of Darwin Among the Machines: the evolution of global intelligence [slashdot.org]

    I just completed the book. It is essentially a history of computers as it relates to philosophy, math, and the nature of intelligence. Probably not the school textbook type, but perhaps good for the class's recommended reading list (or your own).
  • Going a little bit farther with this, it might be a good idea to spend time discussing "firsts" of the computing era without pointing a finger at the absolute "first". With computers it becomes harder and harder to actually point to an idea/device and say it was the first of its kind, because so many similar ideas came to fruition at approximately the same time. Yes, John Atanasoff and Clifford Berry were eventually credited in court with the first electronic digital computer (when the ENIAC patent was invalidated), but that doesn't mean they should be the only people who hold the honor of being first. I suggest that this point should be made relatively clear so as to curtail heated discussions in the classroom, unless of course you want that kind of thing.

    Of course, if you even _think_ of calling ENIAC the first I will have to find you and hurt you since Mauchly visited Atanasoff and discussed with him his ideas on this electro-mechanical marvel and even observed the uncompleted ABC where it was being built in the basement of the Physics building of ISU campus in Ames, IA.

    The Smithsonian iirc has a working model of the ABC that was built recently from Atanasoff's notes and spare parts lying around the US.

    This message was brought to you by a CS grad who spent 4 years at ISU being indoctrinated in how great the ABC was.
  • Of course it is always possible that separate cultures with separate languages who had number systems stumbled upon zero on their own. I mean, it's not like it's something complicated... like one-click shopping.
  • I would recommend Rebel Code [thinkgeek.com]. This will give you a good piece of Open Source history, Linux, Free Software Foundation, GNU, etc. Very nice piece of writing.

    ...and don't forget to put on-line your course notes, when you are done :)

  • Someone else mentioned Alan Turing's theoretical contributions. His work at Bletchley Park in England on code breaking during WWII is also fascinating. The book "Alan Turing: The Enigma" by Andrew Hodges makes good reading. (I am not a shill and derive no financial benefit from sales of this book.) See http://www.turing.org.uk/ [turing.org.uk]

    --

    --
  • If you haven't seen it, go rent (or buy) "Triumph of the Nerds." I'm old enough to have lived through the early days of the personal computer and I thought it was very entertaining. I highly recommend it as background on the PC wars aspect of the history of computing.

    It may not be very scholarly, but it's lots of fun!

    John [gray but grinning] Miller
  • I've done some reading on this subject, and the culture. Useful stuff and resources:
    • Where Wizards Stay Up Late - a history of the Internet from about 1940 through until 1980. Covers J.C.R. Licklider (the psychologist whose vision of computer networking set the ARPANET in motion), Jon Postel, Vint Cerf and more.
    • The Nudist On The Late Shift - a very good account of the culture of Silicon Valley in the mid-90's.
    • Geeks
    • Anything by Robert X Cringely.
    I'm away from home at the mo so can't completely check the library, but I strongly recommend "Where Wizards...".
  • Don't forget the work at Bletchley Park in WW2, building the first digital computer... it's easy to forget because for so many years it was secret.

    One approach that interests me in the history of computing is the cumulative nature of applications.

    What makes a computer different from other inventions is the fact that people keep thinking of major new applications for it. You could take an application at a time and show how that idea started and how it developed. That way your work isn't strictly chronological, but goes back and forward. Example applications:

    Originally, computers were built to do hard sums - Babbage's machines were designed to do statistical analysis in the search for ways to cheat at gambling! And that function continues today with the supercomputers predicting the weather.

    Data processing - from the early business computers to the modern storecard systems that track every time I buy dog food.

    Personal productivity devices - from the initial VisiCalc through to Office XP (Bleaarggh!) or PDAs.

    Communications devices - POS terminals, cash machines, the net, e-mail.

    Games machines - Pong to Quake.

    Control devices - the Apollo program.

    Fun toys for geeks - some Linux history in there, but also things like the Altair and the Sinclairs.

    Etc., etc.
  • What really makes me mad is when they leave out the popular home computers of the early 80's. The Commodore 64, CoCo, Atari and a few others all had a greater influence than the Mac or Apple ][ ever had. Please include these models in your history.
  • Studying the ancient formation of computers is unfortunately the typical approach of studying the "history of computing". It's about as fun and informative as studying that butterfly in China that causes the hurricane in Charleston. Without studying all the intermediate steps, and without then taking a meta step backwards, it's meaningless facts to be regurgitated on an exam with no application to real life, no generalized understanding of society, humanity, and technology, and no imparting of the ability to predict the future (one of the popular reasons given to study history).

    In my opinion, what is needed in this history class, as with all history classes, is to begin with a complete graphical timeline, and then to selectively "drill down" to key inflection points, and relate those points as effects of what came before and causes of what comes later.

    The rapid pace of computer technology and knowledge means that a large portion of the progress came from the youth, and so to study the "culture of computing" means to study the youth of computing, e.g. Steve Jobs rebelling at Atari, the Atari 800 and Commodore 64 training the computer programmers of the 1990s, and of course the creation of Linux. Each generation, of course, also had its "old guard", and the interplay between the old and new guards is interesting. Inventions by the "old guard" include the transistor, microprocessor, and UNIX/C. These were quickly adopted by the youth and rapidly advanced, until these youth grew up and became their own "old guard", etc.

    From analyzing a few of these scenarios, it should be possible to abstract (meta) out some "patterns" (and I am using this word in the "patterns of organizations of people" sense). From these patterns, it would be an interesting exercise to apply them to the present day in order to predict the future -- come up with some different scenarios. Archive the predictions, then rate yourself in a few years in the same sort of class, and refine the process. This refining of the process represents another meta step outward.

    Then you'll be at CMM-3. Right now the class sounds like CMM-0 -- perhaps some good raw material, but not very effective.

  • Hop on your local USENET news server and check out alt.folklore.computers. Or for you web-oriented folks, check out Google's web-based frontend for alt.folklore.computers [google.com]. Many of the people responsible for the history of computers are lurking around there, including people who worked on Multics, early IBM mainframes, and DEC PDP-10 systems. People on the a.f.c newsgroup can provide first-hand accounts of the historical culture surrounding computers. They also offer detailed information on the technical aspects of these systems.
  • And again recently with the demonstration of a reproduction of the ABC, proving that it did work.

    ABC History [iastate.edu]. Also a bunch of general History of Computing links.

  • Some of that sounds a little like the labs I did back in Embedded Systems class - learning to program with the peek-and-poke equivalents in BUFFALO on an HC11 (with two data and two index "accumulators"). Our prof actually told us that that was how students in the 70's used to program the school's old PDP-10. It was actually a very entertaining challenge to work with those kinds of limitations.

    I like this hands-on idea. For the full experience, though, don't forget to give them an allotment of computer time, then charge them for more when they use it all up the first time they create an accidental infinite loop.

    "If I removed everything here that I thought was pointless, there would be like two messages here."

  • Six female mathematicians programmed the ENIAC during World War II and were arguably the world's first computer programmers. Their story was largely forgotten until Kathryn Kleiman did her undergraduate thesis at Harvard about them and got people interested in their story.

    Read about it in the following links:

    http://www.witi.com/center/witimuseum/halloffame/1997/eniac.shtml [witi.com]

    (the above link has pictures of the six)

    And:

    http://www.wired.com/news/print/0,1294,3711,00.html [wired.com]

  • May I oh-so-modestly direct your attention to eCompany Now's Web Guide [ecompany.com], which is my day job. It contains an extensive section of cataloged resources on Technology History [ecompany.com], including a large subtopic pertaining to Computing History [ecompany.com].
  • ... was recently taught at Metropolitan State College of Denver. I didn't have time to take it, unfortunately, but I'm well acquainted with the professor who taught it, and I'm sure she'd be glad to help you out. Her name is Dr. Judith Gurka and her e-mail is gurka(at)mscd(dot)edu. (Sorry for the spamtrap style, but you can parse it out.) Tell her I sent you.
  • > Microsoft Certified Developer dot edu?

    Hah hah hah.

    Metropolitan State College of Denver.
  • Is that there are so many conflicting reports. Who built the first computer? What is a computer? Who actually invented the GUI? Who invented packet switching, and who implemented it first?

    It goes on like that. Every area seems to have differing accounts. It will be hard to combine all of it into a single, sensible course that doesn't contradict itself.

    Of course, the best place to start is books that have already been written. Check out Steven Levy's stuff (Hackers & Insanely Great), also Where Wizards Stay Up Late, Fire In The Valley, and loads more on top. Slashdot has done loads of reviews of books like these; search a bit.

    Oh, and please, don't teach the fallacy of the Internet. It wasn't built to survive a nuclear attack at all.
  • <RANT>
    As many other posters have pointed out, how can you give a course about the history of computing and ignore Alan Turing?!

    Turing was undoubtedly one of the great minds of the last century.

    Have you ever stopped to think what the world would be like today if Alan had not been around when we needed him - the Second World War quite possibly would have been lost to Germany, stored-program computers might not even exist or at least would be in their infancy, and everything stemming from stored-program computers would not exist. There are so many things we take for granted that Alan Turing, through his brilliance, has given us.

    And for what thanks, I ask you? None, that's what. Denied when he was alive, possibly conspiratorially murdered (because of his at-the-time controversial & illegal sexuality (he was (fairly openly) gay)), and unknown to almost everybody now that he is dead.

    Strange, isn't it - ask somebody who "Bill Gates" is, and chances are they know; ask somebody who "Alan Turing" is - they won't have a clue. But who is/was more important to the world?
    </RANT>

    ---
    James Sleeman
  • It's a New Zealander's perspective, so perhaps it is also a British perspective. You are right in saying that without breaking Enigma the situation could still have been resolved by the Americans and their blasted bomb, but it would have taken much longer, and Europe probably would not have recovered - destroyed by Germany, and then destroyed again by a few carefully placed bombs. If it wasn't for Alan Turing, humankind would be much worse off than we are now.
  • For those of you who have clueless friends out there, I've been doing a Jargon of the Day feature in my Weblog: http://www.io.com/persist1/log.php [io.com]... for those who need a gentle introduction.
  • by FTL ( 112112 ) on Sunday March 25, 2001 @06:54AM (#341654) Homepage
    It should go without saying that the Jargon File [tuxedo.org] should be required reading. Not only is it informative, but it is also extremely funny.
    --
  • Just one datum: my brief experience with lecturing was that it took 8 hours to prepare a one hour lecture from scratch.
  • Your description seems to lack attention to maintaining a consistent view. Working with the origins of "zero" or positional number systems on the one hand, and worrying about a study of a subculture of *nix users on the other, is too rapid a move between the view from 50,000 feet and the view from 5 feet. Is this a big-picture course or a detailed examination of modern culture?

    The first, in many ways, is a history of mathematics. The second, modern cultural anthropology. There may be connections and contacts between the two, but not enough to meld courses on those subjects.

    Since you include ANALOG computing, you open the door to non-positional number system computing. The ancient Greeks (and others) made substantial use of geometry to calculate lengths, areas, etc. without a positional number system. I can only guess at what the Chinese might have done.

    One does wonder if the 'culture' of the slide rule as well as the 'technology' of the slide rule culture would be included in your course. I can see the chapter now: "the slide rule and the pocket protector."

    Good luck with the course design!

  • I've been reading a book called "The Nothing That Is - A Natural History of Zero" by Robert Kaplan. Probably goes into much more detail than you want about who invented/re-invented the zero, but it's an amazingly enjoyable read.

  • fred and sheila are your friends
    .oO0Oo.
  • in fact, post it for free, seeing as your expenditure is time and you'll be profiting from the free time of others
    .oO0Oo.
  • look at the culture of Slashdot

    culture, it's pure barbarism!
    .oO0Oo.
  • There are several of these classes already in existence. You may want to check out David Silver's Resource Center for Cyberculture Studies, http://otal.umd.edu/~rccs [umd.edu]. The core text I would imagine is Ceruzzi's "The History of Modern Computing"; another great classic book is Susan Leigh Star's "The Cultures of Computing". Of course, if you want to go for the more popular read, then Hackers. I haven't read "The Universal History of Computing" yet, but it does cover a broader base.

    some other sites of interest are the IEEE annals of the history of computing: http://computer.org
    Virginia Tech and the NSF's history of computing: http://ei.cs.vt.edu/~history/index.html [vt.edu]

    to get another audience on this, you may want to ask your question on, or join, http://aoir.org [aoir.org], the Association of Internet Researchers

    My own work and teaching is centered more around the Internet, but it seems to me you want to look at the earliest foundation of computing, such as the origin of information theory, which is quite interesting in itself.

    Jeremy
    Center for Digital Discourse and Culture
    http://www.cddc.vt.edu

  • They sold punch-card-based systems to track Jews.

    They actually sued the Germans for damaged material at the end of the war.

    Basically they saw a business opportunity and took it, making it easier to kill Jews in the process.

  • I am one of those odd Computer Science & History double majors.. currently working on my senior thesis..
    I tried something similar to what you are going to try, or are attempting..
    I ended up writing a paper on the life of Alan M. Turing.. (basic founder of modern comp sci)
    The History department wouldn't go for the whole history-of-computing idea.. so before you get too far into it.. if not already..
    consider the following..
    1) I ran into the problem that no one in the History Dept was willing to mentor my thesis project.. as far as keeping the
    "historical" approach to it, as well as the primary materials to be used.. and it was also so very recent...
    2) Is this going to be taught as a History class or as a Computer Science or an MIS class? Those would be three completely different classes in themselves.. I am TA'ing a Computer Law and Ethics class and it's really sad what seniors and juniors in computer science are turning in as far as their writing skills.. analytical ability when writing.. etc.. some of them don't even know how to use a word processor, but they know how to code (well, some of them), which is just sad..
    3) Though a lot of computing history has been in the last 5 to 10 years.. you shouldn't spend too much time on it..
    4) Break the class into topics and show the evolution of thoughts from start to finish.

    A suggested Topic Breakdown in no particular order.. mostly just a braindump..
    1) Introduction: what is a computer, origins, and overview.
    (Don't dumb down the "what is a computer" part, or over-technicalize it either)
    a) The Word Meaning
    b) Early Thought
    c) Mechanical Computers
    2) Basic Ethics of Computers, Setting a baseline to what the class will be taught on.
    3) Early Computers Pre-1940
    4) Birth Of the Computer as we Know it
    a) The Impact of WWII
    1) IBM's Role in the Holocaust
    (Don't know a lot of detail, but have heard that they were hired by Germany to help them track down the Jews in pre-war years)
    2) Artillery Projection
    3) Cryptology
    4) Massive Influx into Research Funding for the first time..
    b) Post War Thought - From Concept to Reality
    c) Impact of the Cold War
    1) Large Numbers of "SuperComputers"
    2) Need for Networking
    3) Need for Mass Storage
    d) From Vacuum Tube to Transistor
    e) The Impact of the Space Race
    f) Academic Research
    5) The major computer manufacturers, their origins.. and evolution
    1) IBM and their eventual monopoly
    2) Intel
    3) The Former Cray Computers
    4) Unisys
    5) Sun Microsystems
    6) Microsoft
    7) Motorola
    8) Compaq DEC (VMS)
    6) Birth of the Personal Computer 1979-1992
    A) The major early players (Describe the differences in platforms, i.e. different OSs, different processor base, corporate methods, pros/cons)
    1) Commodore (1979-1994)
    2) Apple (1979-Current)
    3) IBM (1981->) and its compatibles
    4) Misc Others
    5) The First Wave of PC computer manufacturers dies off.. (approx 1985-1989)
    6) Description of Apps and Hardware Available to them
    7) The role of Mainframes and networking in general during this time..
    7) The Web Boom (1992-2001 (current)) (It may be over, or the next period may be beginning)
    A) The Early Web Years 1992-1996
    1) Early Online Services (AOL, Prodigy, CompuServe) and BBSs - how things were done before the Internet (I personally ran a BBS through 1996)
    2) Internet before the Web
    i) Government Origins
    ii) Telnet, Gopher, and FTP, early utils available.
    iii) Birth of HTML and the Web Browser (1992-93)
    Lynx, Mosaic, Netscape, and later IE.
    3) The Web Initially (1992-1995/96)
    i) Very Basic, Plain HTML, Slow, Expensive
    ii) Very little E-Commerce
    iii) Still Mainly used by Academics, Government, and Just starting to catch on.
    B) The explosion..
    1) E-Commerce
    Created a demand for Access and Demand for Web Servers
    2) Linux Birth and Unix Revival (Begins)
    3) Wide Spread Access
    4) Cheap PCs
    5) Faster Access (Broadband)
    6) Hardware Getting Exponentially better and cheaper every few months
    7) Increased Availability of Software
    8) GUI interfaces, which reached the Mac in 1984, Commodore's Amiga in 1985, and MS Windows in 1985-87 or so. Xerox had one of the original ideas (the Alto and the Star).. but never turned it into a mass-market product..

    Well, I just realized how long I have spent replying to this.. and I need to get back to what I was working on.. but if you want more ideas, etc., email me.. I have a few books.. though I wouldn't recommend the book I am currently using in the class I am TA'ing, as it is more a computer ethics class book.. but it ain't bad overall..
    It could be quite a project though.. have fun.. Let me know what you go with..
    But as I said I gotta go and get some things done so I can get out of college this semester..

    Chris Souser aka Volhav
    volhav@acerbic.org

  • "The Turing Omnibus: 61 Excursions in Computer Science" by A. K. Dewdney
    BTW, that book's been updated: it's now called "The New Turing Omnibus: 66 Excursions in Computer Science"
  • Almost every digital computer has a clock in it. You should devote one lecture to the history of clocks.

    Some of the fancy cuckoo clocks built in Europe at the height of their popularity could arguably be called the first "robotic multimedia" computing devices.

  • I'd love to know where this guy is located. I'd like to help out, but more importantly, I'd love to enroll in that college to take the course! Good luck, man.
  • No, screw that, I want the f-ing history of foobar right now.

    I am sick of learning some language and being forced to type
    @foo + @bar =

    or
    #!/usr/bin/perl -w
    use strict;

    my $foo = "bar";
    print $foo;
    ---
  • Take a look at: http://ftp.arl.army.mil/~mike/comphist/ [army.mil] Compiled by the late (but Great!) Mike Muuss (creator of ping and BRL-CAD).
  • Another one that might be useful: The Nothing That Is: A Natural History of Zero, by Robert Kaplan.

    The flowery, overly verbose language can get a bit annoying at times, but it is a reasonably interesting read.
  • As far as IBM computers/calculators are concerned, there are two authoritative books:

    "IBM's Early Computers" by C.J. Bashe, L.R. Johnson, J.H. Palmer and E.W. Pugh, 1986, MIT Press

    "IBM's 360 and Early 370 Systems" by E.W. Pugh, L.R. Johnson, and J.H. Palmer, 1991, MIT Press

    Both part of an MIT Press series on the history of computing: http://mitpress.mit.edu/books-in-series.tcl?series=History%20of%20Computing

    The two books on IBM computers give a lot of internal IBM culture, but one should be aware that the authors were all IBM employees when the books were researched and written. This gave the authors excellent access to IBM sources, but also gives the books a decidedly IBM point of view. Caveat lector.

  • I would have felt so much better if you had at least mentioned Turing in your post to Slashdot... go back and do your homework :-)

    Slashdot featured a link to a pretty good article [umaine.edu] that gives a very cute and short intro to how Alan Turing arrived at his famous Turing machine and to the start of computers and computer science. It does a quick mathematical history from Cantor through Hilbert to Gödel and Turing.

    I have studied computer science and find that certain facts about the history of computer science seem to go over better with non-computer-scientist audiences (the friends and family to whom I try to explain what computers are). Your lessons should at least cover the following topics:

    Explain what the generalized Turing machine is and how it was (and still is) used to describe the 'limitations' of computing machines (a minimal simulator is sketched after these suggestions).

    Explain with as little math as possible what NP means and what impact it has on computing.

    Explain Moore's law and compare it to other industries to show that computer science is something very, very new in our world history.
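
    For concreteness, here is a tiny Turing machine simulator. This is my own sketch in Python, not anything from the article above; the run_tm and flip names and the bit-flipping example machine are invented purely for illustration.

    BLANK = "_"

    def run_tm(transitions, start_state, halt_state, input_string, max_steps=1000):
        """Run a one-tape Turing machine and return the final tape contents."""
        tape = dict(enumerate(input_string))   # sparse tape: position -> symbol
        head, state = 0, start_state
        for _ in range(max_steps):
            if state == halt_state:
                break
            symbol = tape.get(head, BLANK)
            write, move, state = transitions[(state, symbol)]
            tape[head] = write
            head += 1 if move == "R" else -1
        cells = [tape[i] for i in sorted(tape)]
        return "".join(cells).strip(BLANK)

    # Example machine: walk right, flipping 0 <-> 1, and halt at the first blank.
    flip = {
        ("scan", "0"): ("1", "R", "scan"),
        ("scan", "1"): ("0", "R", "scan"),
        ("scan", BLANK): (BLANK, "R", "done"),
    }

    print(run_tm(flip, "scan", "done", "10110"))   # prints 01001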

    I recommend reading "The Age of Spiritual Machines" by Ray Kurzweil. It's both an interesting overview of computing history and a speculative look at its future.

    Traa
    ---------------------------------
  • Well, for a start there's the IEEE History Of Computing [computer.org] page.

    There's also the University of Manchester Department of Computer Science history [man.ac.uk] and "50 years of computing at Manchester." [computer50.org]

    Or the Alan Turing Home Page. [turing.org.uk]

    Alan Turing used to drink at the Salisbury Arms, on Oxford Road in Manchester, which although serving a decent pint, is now way too packed in the evenings to be able to think in base 32 anymore.

  • There's also The Computer Museum History Center [computerhistory.org], though I'd be careful. I spotted 3 major omissions/mistakes in the first ten minutes of reading the site.
  • Last semester our seminar course was based on Computing history. Let me tell you that it was the worst thing that I ever had to sit through in my entire life!

    I ended up talking about the history of supercomputers, which was not that bad, but most of the people decided to talk about people like Babbage, Turing, etc.

    Big waste of time!

  • Do not dismiss foobar so lightly, young man. Metadata is important stuff when learning a new language, and a commonly-understood set of metadata (foo, bar, baz, qux, and (for PHP users) needle and haystack) helps you separate the user-defined components of the language from the keywords. Very important!

    If you doubt it, look at the foo-less world of DOS/Winders. Their idea of metadata is to prepend "my" to everything. Would you rather have:

    myvariable = "mystring";

    or

    foo = "bar";

    I must admit, however, that my favourite metadata usage is in the php manual:

    strstr(haystack, needle);

    very clever and much better than:

    strstr(mystring, myotherstring);
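
    In Python terms, purely as a sketch, the same haystack/needle convention might look like this; find_substring is a hypothetical helper, not part of any standard library:

    # A hypothetical Python analogue of the haystack/needle convention.
    def find_substring(haystack: str, needle: str) -> int:
        """Return the index where needle first appears in haystack, or -1."""
        return haystack.find(needle)

    print(find_substring("foobarbaz", "bar"))           # 3
    print(find_substring("mystring", "myotherstring"))  # -1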

  • You are not going to find much, as very little has been written yet. Your idea is excellent, and a text on the subject would make an excellent doctoral thesis. I would concentrate more on the people who built computing than on the hardware and events.

    There may be a biography published on Grace Hopper. She had the most influence on the computer culture as it grew; she was the original guru. Dennis Ritchie and the Bell Labs boys are still alive, so interviews might be possible. Ward Christensen built the basis for file transfer between computers (XMODEM, which later inspired YMODEM, ZMODEM, etc.). He gave all his work away, and the distributions on 8" floppies spanned several disks, all his own work.

    'The Cathedral and the Bazaar' by Eric Raymond is the only publication I know of that directly addresses computer culture. It covers roughly 1985 to the present but focuses on the open source community only.

    Good luck, Tom
  • Some points that I think are important (in no particular order) are:
    • Culture clashes between subgroups in the industry (Mac vs PC, vi vs emacs, MS vs Linux) and the characteristics of the individual subgroups
    • Culture clashes between subgroups in terms of levels of expertise (newbies vs experts vs old-timers); as a side note, the phenomenon of the "September that never ended" is educational
    • Culture clashes with the outside world; this starts touching on the hacker ethic, etc., but is also illustrated in things like comments made to Babbage (along the lines of, "if we put in the wrong questions, will it still give us the right answers?")
    • The size of some of the communities has often been much smaller than would be imagined from their eventual impact. The original hacker communities in the 1980's did not number thousands, more like a few hundred, with a few dozen core experts. As such, there is often a certain provincialism that creeps in from time to time. The world is often not seen as being as big and diverse as it really is.
    • The resemblance of some communities to a religion (Mac evangelism, for example, but there are many others) and the clashes this creates.
    • The unsung heroes, people who invented the basic technology and never saw a decent return. (I still think everyone should send Doug Engelbart, the guy who invented the mouse, a buck or two just to say "thank you!" This should be a community project of some sort.)
    • The Dead Media Project, found at deadmedia.org [deadmedia.org] (and as noted earlier here [slashdot.org] on /.), is an interesting overview of the obsolescence of technology. The articles on the information technology of ancient Athens are particularly worthwhile. (Seen here [deadmedia.org] in the numeric listing as items 38.6 - 39.0)
    • Also, Scientific American had a recent story [sciam.com] that mused about being an information technology worker in Mesopotamia, crunching the numbers that made the cities work.
    • The generations of older technologies, including the older tube computers, relay-based logic (which is wild in its own right), and even music technologies such as those used by Raymond Scott [raymondscott.com] (teacher of Robert Moog)
  • As a good little human factors bunny, I must say: please don't forget interface. To most reasonably non-technical users of computers, the interface is the computer. Cliche, yes, but true -- you might very reasonably divide your chronology into
    • hardwiring
    • punchcard command line
    • smart terminal
    • early GUI
    • ...


    Not that that's a comprehensive list or necessarily a good organizational conceit, but from an end-user perspective, it's what's important. (A caveat: the "end-users" of punchcard systems and such were almost always techies themselves.) This is, as usual, the lesson that some of the Slashdot crowd would do well to learn from Apple: not everyone cares what's under the skin of their computing environment. Not everyone knows or wants to know or needs to know.

    Steven Johnson, founder of Feed [feedmag.com], has an excellent book, Interface Culture [amazon.com], which details the progression of computer interfaces and turns a critical eye on their cultural and psychological implications.

    I'm reading this book now for a seminar on human-computer interaction, but it's just as applicable to the sort of course you're proposing. What I particularly like is its focus not just on the development of GUIs but also on the power of textual and hypertextual interfaces, which honestly are the only popular interface innovation of the last decade or so, since the desktop-metaphor GUIs have stagnated.

    Good luck with your syllabus. Count me among the folks who would like to see what you come up with. (For the record, I really like the idea of teaching about the history of computer science problem-solving (e.g., NP-complete problems, number systems, etc.)... this is often overlooked in favor of histories of the big iron that folks were building, but the math is just as important.)

    Andrew
  • by w00ly_mammoth ( 205173 ) on Sunday March 25, 2001 @07:12AM (#341689)
    For a development so recent, there is considerable controversy over some basic details. For instance, consider the GUI. Popular folklore attributes it to Xerox PARC, followed by Apple. But there is another viewpoint, the almost forgotten Engelbart. [stanford.edu] It's amazing that with all the people involved in these inventions still alive today, nobody quite agrees on who invented what. It's another matter trying to figure out who invented the first computer. [time.com] I can't imagine what it will be like in a hundred years, when people look up contradictory records postulating various different accounts. Good luck trying to piece it together.

    As for older theoretical subjects, one book you'll find invaluable is Donald Knuth's The Art of Computer Programming, in which he painstakingly traces back the history of various mathematical and computational developments.

    Who invented the zero?

    w/m
  • I had a "survey" course taught by a brilliant if somewhat difficult professor. It was an undergrad "200" level course, so roughly a 2nd - 4th class in the field. From transistors through assembly to high-level languages with algorithms and data structures, it attempted to give an overview of the whole machine. Note that it did not directly address cultural issues, etc. (although these came up in anecdotes and stories, inevitably...).

    Many of the students found this quite challenging and a bit overwhelming. I had a bit more background and so could place many items into a pre-existing framework of knowledge. I found the course very interesting and very much appreciated the context it provided to topics addressed elsewhere in isolation.

    Regarding your goal, I see it as a good thing. I also support the idea of offering it to those with some prior classwork and experience under their belts. Too many people I meet these days don't have sufficient understanding of the context in which they work and create. The ranks of the old "gurus" are thinning, and their stories and hard-won knowledge are becoming scarcer. I believe giving current students some knowledge of what has come before is a good thing.

    The "early days" also produced a sense of community that should be remembered. (Note that this happens in many fields, not just information systems or whatever you call it.) I recall getting a new editor on the school VAX just by asking and by being willing to "sneakernet" a tape from someone's employer. They were happy to see it used. With all the "business" going on these days, it's nice to remember the generous spirit of some of those who contributed so much to getting us where we are, now. And who found facination in the topics themselves, regardless of their financial worth.

    As for books and such, I can recommend:

    The New Hacker's Dictionary, 3rd ed. -- the Jargon File plus a bit, stamped on a dead tree.

    Where Wizards Stay Up Late -- the birth of the ARPANET and subsequent things.

    Don Knuth's writings. Aside from being thorough, he's informative and funny.

    Good luck!

  • Hey, don't limit yourself to lecturing. If you want people to get a feel for the history of computing, have them experience it. Get yourself some vintage hardware or some emulators. Actually have the students write programs on punch cards. Have them struggle against the 3,583 bytes of free memory on a stock VIC-20. Have them flip switches on a little box with lights and nothing else. Have them write in assembler, or better, machine language. Nothing too complicated, of course, but then they will appreciate the real progress of computers -- "You mean somebody actually worked like this every day for years?"
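
    To give a flavor of what that feels like without real hardware, here is a toy "front panel" machine sketched in Python: an accumulator, a program counter, and numeric opcodes only. The LOAD/ADD/STORE/HALT instruction set is entirely made up for the exercise and is not any real machine's opcode table.

    # A toy machine: opcode/operand pairs in memory, executed from address 0.
    LOAD, ADD, STORE, HALT = 1, 2, 3, 0

    def run(memory):
        """Execute opcode/operand pairs starting at address 0."""
        acc, pc = 0, 0
        while True:
            op, arg = memory[pc], memory[pc + 1]
            pc += 2
            if op == LOAD:
                acc = memory[arg]      # load a value from memory
            elif op == ADD:
                acc += memory[arg]     # add a value from memory
            elif op == STORE:
                memory[arg] = acc      # store the accumulator
            elif op == HALT:
                return memory

    # Program: load the value at address 8, add the value at address 9,
    # store the result at address 10, then halt.
    mem = [LOAD, 8, ADD, 9, STORE, 10, HALT, 0, 2, 3, 0]
    print(run(mem)[10])   # prints 5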
    -----------
  • The big problem any history course would have is that the materials around to date tend to be as biased and one-sided as those in any other field.

    A lot of info can be found in the history-of-cryptography literature. Many of the pioneers of pre-digital computing were also into cryptography: Babbage broke the Vigenère cipher, Turing broke Enigma (the source of the Ultra intelligence), etc.
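
    For reference, the Vigenère cipher Babbage broke is simple enough to sketch in a few lines; this is just an illustration of the scheme itself (shift each letter by the corresponding letter of a repeating key), not a claim about how Babbage attacked it:

    from itertools import cycle

    def vigenere(text, key, decrypt=False):
        """Encrypt (or decrypt) an uppercase A-Z string with a repeating key."""
        sign = -1 if decrypt else 1
        out = []
        for p, k in zip(text, cycle(key)):
            shift = ord(k) - ord("A")
            out.append(chr((ord(p) - ord("A") + sign * shift) % 26 + ord("A")))
        return "".join(out)

    ct = vigenere("ATTACKATDAWN", "LEMON")
    print(ct)                           # LXFOPVEFRNHR
    print(vigenere(ct, "LEMON", True))  # ATTACKATDAWN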

    I would not spend more than one week on the pre-machine computing era, but the idea that choosing the right representation is the key to making systems computable is important. Place-value systems are important because they make the abacus possible.

    On the invention of the electronic computer there are a bunch of competing claims. Before and during WWII Konrad Zuse built the Z1, Z2, and Z3 machines, the first programmable computers to be built and working (mechanical and relay-based rather than electronic). The Bletchley Park Colossus machines were the first major electronic computational devices, although programmability was limited.

    You could also look at Feynman's memoirs, where he describes the computing facility of the Manhattan Project; they used tabulators for computation. [Kind of brings into question the effectiveness of ITAR restrictions on 'supercomputers' to 'stop rogue nations building nuclear devices', eh?]

    After that it depends on whether you read a UK or US history. Both tend to imply that the other groups didn't exist - although in practice they were often collaborating. Manchester tends to be ignored in the US histories, as for that matter is the fact that the US was nowhere near dominant in computing until comparatively recently.

    There is quite a bit of computer history in the book 'Big Blue' which is a biography of IBM written by the anti-trust investigator who tried to break them up.

    Another era not to forget is the micro-vs-mainframe fight. Again, the real action here was in Europe, where manufacturers were building $100 computers aimed at the consumer market that sold in the millions. Sinclair outsold the PET, Apple, and Tandy put together in those days. A lot of that was driven by the video-game scene.

    There are a bunch of books around like Where Wizards Stay Up Late, Architects of the Web, etc. However, beware that many of these were written as corporate PR puff pieces. 'Wizards' is mainly an account of the BBN view of the Internet. 'Architects of the Web' is a hard-core hagiography of Marc Andreessen written by the Netscape PR firm; Tim Berners-Lee is mentioned only in passing, to explain how he got everything wrong. It was an attempt to rewrite history with Marc as the sole inventor of the Web.

    A lot of the other computer histories tend to fit in the same mould.

  • Nobody invented zero.
    It was never there to begin with.
  • Computer Architecture: A Quantitative Approach by John L. Hennessy and David A. Patterson (with David Goldberg), Morgan Kaufmann Publishers; ISBN: 1558603298

    This is mainly intended for studies of computer architectures and instruction sets, but goes into a fair amount of detail, with additional reading suggestions, on the history of the computer. It covers from early computers all the way up to the most recent (the Pentium series). It is primarily devoted to teaching computer architecture using the MIPS instruction set, but has rich information throughout on practical aspects of computing, the evolution of Intel's dominance in the PC chip market, the downfalls of some of the many forgotten companies that were early innovators (and computing giants, for that matter), evaluation benchmarks, and comparisons between Intel and Motorola processors. The list goes on, and I have only read up to chapter 6. I highly recommend looking at this if you can find it in a library. Otherwise it is fairly pricey... but it keeps on giving. Pretty light on the culture side, though.

    Other than that, I don't think this class would be complete without an introduction to Moore's law and its predictive assertions as to the future of computing. Moore's original paper is a good start, and a quick back-of-the-envelope calculation (sketched below) makes the point vividly.

    Cramming more components onto integrated circuits [intel.com]

    Gordon E. Moore

    Electronics, Volume 38, Number 8, April 19, 1965

    Lastly, some mention of the current efforts in nanotechnology and molecular computing to push past the physical limits that threaten Moore's law may be worthwhile.
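
    Just to make the doubling concrete, here is the back-of-the-envelope sketch mentioned above (my own, not anything from Moore's paper). The two-year doubling period and the ~2,300-transistor Intel 4004 starting point are commonly quoted figures, used here as assumptions:

    # Back-of-the-envelope Moore's law: transistor counts doubling roughly
    # every two years from the Intel 4004 (1971, ~2,300 transistors).
    start_year, start_count = 1971, 2_300
    doubling_years = 2.0

    for year in range(1971, 2002, 5):
        count = start_count * 2 ** ((year - start_year) / doubling_years)
        print(f"{year}: ~{count:,.0f} transistors")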

  • I see a bunch of you waiting to jump on the bandwagon: "yeah yeah i wannaaa take that course". Here you have a bandwagon created by someone who knows so little about the subject that they have asked the question on /.: "where can I find this info?". This is my pet peeve: university courses created by people who don't already know the subject matter.

    I once had to take an intro-to-computers course at a tech school as a prerequisite to some coding courses. The teacher told me blatantly, "I don't have to know anything about computers. I'm just here to do my job, whatever it is they tell me to do." It was true. She knew nothing about computers. It was a daily task. At one point in the course she was supposed to explain the hexadecimal and binary number systems. When it came down to the wire, I had to explain to her and the class how counting in hex works, because in essence people take base 10 for granted while never understanding what base X really means: start at 0, count up to the last digit, then move to the next place and repeat the 0-to-last-digit count... so you can count in hex as easily as you can count in base 10. She was the teacher in this class, yet she knew nothing about the material and in fact had no interest in it.

    My friend, a year is not enough. If you have lived the history of modern computing as I have, if you have coded on a C64, an Apple ][, a CoCo, an IBM PC (8088), XT (8088), AT (286), 386, 486, Pentium/II/III, watched Gates rise to power, seen the internet from a telnet shell, IRC'd from a printer/keyboard terminal, BBS'd at 300 baud on up through 56k PPP dialups and finally to DSL and a modern ethernet-based internet, then you have lived, slept, eaten, and breathed the history of computers and love them as if they were sentient... and perhaps you would not be asking such a question as "where do I find this info?".

    Why are you teaching this course? I don't want to be mean about it, but it's exactly this sort of thing that causes the acute stagnation of the industry's development that we now have. M$ is not innovation. Ask Gates and he will tell you that it is. Ask me and I will tell you that I have lived it from the beginning, when M$ was nobody, from when the PC was just being conceived. I ate the history. I don't have to ask where to get it; it's all over the net.

    Ask yourself again why you are doing this. Ask if your contribution is stagnation or innovation. Ask yourself if you are doing this to keep your job, or if you should even keep your job. Ask yourself if you are contributing to the output of misguided computer industry people that we are now producing. Does everyone still want to jump on the M$ bandwagon and learn from ppl who do not know? Or would you rather learn from ppl who love and breathe the machinery we work on and truly know the history? I am not a Dr., but I do know when I'm sick enough to go see one, and I am smart enough to know when he's double-talking me. Also, I am not a professional teacher, but I do know this: teach what you know. Leave the teaching of what you do not know to those who DO know.
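
    Incidentally, for anyone who has never worked the base-X rule through, it is easy to make concrete in a few lines of code. This is only an illustrative sketch; to_base is a made-up helper, not a standard library function:

    # Convert a non-negative integer to its digit string in an arbitrary base,
    # illustrating "count 0..last digit, then carry to the next place".
    DIGITS = "0123456789ABCDEF"

    def to_base(n, base):
        """Return n written in the given base (2..16)."""
        if n == 0:
            return "0"
        digits = []
        while n > 0:
            n, remainder = divmod(n, base)
            digits.append(DIGITS[remainder])
        return "".join(reversed(digits))

    # Counting from 0 to 20 in hex shows the roll-over at F -> 10:
    print(" ".join(to_base(i, 16) for i in range(21)))
    # 0 1 2 3 4 5 6 7 8 9 A B C D E F 10 11 12 13 14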
