History and Culture of Computing?
living phoenix wrote: "I'm currently working on the feasibility of teaching a class in the history and culture of computing for a collegiate senior thesis project. Basically, I would spend a little over a year studying the subject matter to create enough class materials for 3 hour per week class sessions for 16 weeks. I would like to cover, in very brief terms, the "invention" of zero and the positional number system as it relates to computers, mechanical computers including Babbage's Analytical Engine, analog computing, ENIAC, EDVAC, the Mark Series and the first "bug", the PDP series up through the moderns with shorter stops at the creation of the internet and systems design. This is a massive undertaking for me, in part because I have so much research to do to simply select the points that are best suited for a 16 week course. Has anyone ever taken a course such as this, heard of such a course, or know anyone who has taught the course? Also, I'm making a request for research materials: if you have a text that you thought was intriguing and/or would pertain well to the course objectives, please let me know so I can use it in my research."

Well, just searching Slashdot I see a CNET article and a book review that will help you out. And don't forget the history of Unix. My guess is that there will be too much material - the hard part will be selecting what is important to keep in and what can be left out.
foobar (Score:1)
Textbook or Book.... (Score:1)
I would guess many others are interested in this subject (I am, as a grad student and a teacher) and would be interested in taking this course and/or buying the book.
Good luck, and great idea!
Re:Jargon File (Score:1)
Must...Not...Have...Own...Ideas.
Must...Get...Culture...Prepackaged...From...Books.
Eh? Heh, I'm taking that class! (Score:3)
Anyway, I'm taking a class with Arthur Norberg [umn.edu], and we're using 4 books:
Where Wizards Stay Up Late: The Origins of the Internet by Katie Hafner and Matthew Lyon
Computer: A History of the Information Machine by Martin Campbell-Kelly and William Aspray
The Soul of a New Machine by Tracy Kidder
Transforming Computer Technology: Information Processing for the Pentagon, 1962-1986 by Arthur L. Norberg and Judy E. O'Neill
--
Museum (Score:1)
I don't know whether you have the time and money for the trip, but a very good recommendation (worldwide) is the Deutsches Museum [deutsches-museum.de] in Munich. They have very extensive coverage of computer science, beginning with simple mechanical calculators like those of Leibniz and Pascal, through the (working!) Zuse Z3 (a 1941 computer), up to the modern developments of today's microprocessors (but not operating systems or software).
It is very interesting to take a guided tour and have it all explained to you. There is also a book available at the museum store, which can possibly be ordered online and in English (Publications (German) [deutsches-museum.de]).
Possibly there is also a museum of the history of science somewhere in the U.S., but if you plan a trip to Europe anyway, the Deutsches Museum is definitely worth the time for you.
Sebastian
Possible Material (Score:1)
les
Re:Who invented what? (Score:1)
w00ly mammoth wrote:
Who invented the zero?
There are a couple of cultures that dispute the invention of zero: the Mayans and the Indians. The Mayans came up with the concept of zero sometime around two thousand five hundred years ago, and used it in their calculations and other symbolic representations. For example, the Mayan origin of time is 11 August 3114 BC. This was expressed in Mayan as 13-0-0-0-0, and could be read as 13 cycles of roughly 400 years ago. This cycle ends on 21 December 2012, when the Mayan calendar will roll back to 1-0-0-0-0.
Unlike our numeric system, the Mayan system was base 20 (does that mean they began counting with their fingers and toes? Those sandals sure are useful in tropical climates...). Zero is used to express "the beginning before the count".
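A quick way to see the Long Count as a positional system is the sketch below (my own illustration, not from the post; it assumes the standard place values, in which the third place counts 18 rather than 20, so a tun is 360 days and a baktun is 144,000 days, i.e. roughly 394 years rather than a flat 400):

```python
# Convert a Mayan Long Count date (baktun-katun-tun-uinal-kin) to days.
# Place values assume the standard Long Count: the third place is base 18,
# so a tun is 360 days and a baktun is 144,000 days (~394 years).
PLACE_VALUES = [144000, 7200, 360, 20, 1]

def long_count_to_days(digits):
    """digits = [baktun, katun, tun, uinal, kin], e.g. [13, 0, 0, 0, 0]."""
    return sum(d * v for d, v in zip(digits, PLACE_VALUES))

days = long_count_to_days([13, 0, 0, 0, 0])
print(days)                  # 1872000 days
print(round(days / 365.25))  # about 5125 years -- the "13 cycles"
```

The irregular 18 in the third place is why the Long Count is only "mostly" base 20.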
It is argued that the number zero was also invented in India and adopted by the Arabs around 1000 AD. Archeological findings (AFAIK -- AINAA), however, place the invention of zero in the hands of the Mayans at least 500 years before.
Please feel free to correct my ignorance.
Cheers!
Re:Who invented what? (Score:2)
It is not even certain who invented the Macintosh; perhaps it was Jef Raskin [jefraskin.com], and perhaps others (even Jobs) had a good idea or two during its development too.
It's amazing that with all the people involved in these inventions still alive today, nobody quite agrees on who invented what.
Jef Raskin is certainly not amused. While I can't judge whether he was the true father of the Mac, his point that the truth was far less sexy than the common myth (a former professor, in contrast to some dropped-out students) seems reasonable to me. Raskin had the necessary background, and it seems more probable to me that the Mac was no accident but the result of a longer line of thought.
It's another matter trying to figure out who invented the first computer.
Sure, Konrad Zuse [epemag.com] together with his Plankalkül language.
By the way, the teacher should mention that one of the first high-level languages, COBOL, owes much to a woman, Admiral Grace Hopper.
At American University (Score:1)
http://www.csis.american.edu/ [american.edu] - Department web site
http://www.clark.american.edu/~tbergin/ [american.edu] - Professor's web site
http://www.csis.american.edu/museum/sloan/html/de
Hope that helps some!
History of the Net (Score:1)
Re:Useful books: (Score:1)
What's your background? (Score:2)
While how you'd present things is your call, if I were teaching the course I'd very much try to mention the social context of computing history. For instance the influence the space program had on the development of the integrated circuit.
Outline Comments (Score:1)
Brits and emulators... (Score:2)
First off, don't neglect the British --- Williams tubes (early CRT memories), index registers, and demand paging from Manchester alone; the first well documented subroutine library, first commercial use of a computer in business (Lyons LEO), and a very influential textbook from the EDSAC group at Cambridge. (It's interesting to note that the word "page" was already used at Manchester for a unit of physical memory block-transferred to backing storage --- magnetic drum --- in Alan Turing's manual [mit.edu] for the commercialized Manchester Mk. I).
Second, emulators are available for a lot of historical systems --- Bob Supnik has his SIMH suite [tiac.net] of emulators for most of the PDP computers, and a few other early minis from IBM, DG, and so forth. Historical Unix (v5, v6, v7) is generally available and does boot on the PDP-11 emulator. He's still working on the PDP-10, for which see also Tim Stark's ts10 [trailing-edge.com], also in alpha, but already booting TOPS-10; TOPS-20 and ITS are on the todo list. (The annoying thing is that working PDP-10 emulators do exist, but are not available to the public).
There's a limit to the verisimilitude here --- virtual tape never kinks up, the virtual card readers never jam, and the emulators often run an order of magnitude faster than the real machines on modern hardware. But they can still help give students a feel for the environments that people had to deal with thirty and forty years ago.
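For the curious, SIMH emulators are driven by short configuration scripts. A minimal sketch (the CPU model, memory size, and disk image name here are illustrative assumptions; check the SIMH documentation for the exact commands your version supports):

```
; boot.ini -- illustrative SIMH PDP-11 setup for booting a historical Unix
; (v6root.dsk is a hypothetical RK05 disk image name)
set cpu 11/40          ; Sixth Edition Unix ran on 11/40-class machines
set cpu 128K           ; memory size
attach rk0 v6root.dsk  ; mount the disk image on the first RK05 drive
boot rk0               ; boot from it
```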
My suggestions for resources (Score:2)
Good luck with the class! --Tom Geller
IEEE Annals of the History of Computing (Score:2)
disciplines: tech, social, biz, literature (Score:2)
I'd include the following:
(1) History of the technology- all the way back
to Babbage, picking up after the WWII with the
first programmable mainframes, minis, PCs, PDAs,
and NET.
Then there is the parallel history of software.
Concepts of programming, assembler, compilers,
clients, distributed, databases, games, browsers and so on.
Proprietary until the 1970s, then a mass software market and immense wealth afterwards.
(2) Engineer/programmer/hacker culture.
Books by Levy, Cringely, and Katz. "The Soul of a New Machine" by Kidder.
(3) Business cycles- moguls and busts.
Rise, fall, and resurrection of IBM. Apple.
MicoSoft. Silly Valley. Rise and fall and rise
and fall of gaming. Famous speculative bubbles
like minicomputers, expert systems, and the InterNet. Books by Stross.
(4) Literature and movies, including sci-fi.
2001: A Space Odyssey, Asimov's computer and robot
stories, Neuromancer, Cryptonomicon(?). Lots of stuff.
Speculation versus reality.
Re:Currently doing something similar.. (Score:2)
Intrigued by this (which I had never heard of), I found a link to more information [aol.com].
--
Evan
Useful books: (Score:5)
For the history of Mathematics, invention of zero, etc. read: "Mathematics: Queen & Servant of Science" by Eric Temple Bell.
Before the internet and the personal computer, two of the major uses and research topics for computers were Cryptography, and Artificial Intelligence. Of course plain old number crunching has always been important, but I don't know of any books on that.
But for the early history and development of cryptography, check out: "The Code Book: the evolution of secrecy from Mary, Queen of Scots to quantum cryptography" by Simon Singh
For a read on the early history of AI from a nay-sayers perspective, check out "What Computers Still Can't Do: A Critique of Artificial Reason" by Hubert L. Dreyfus.
Hmmm. Possible course outline schedule:
1. History of mathematics.
2. Pre-history of computers: Abacus, mechanical machines, Pascal, Von Neumann, etc.
3. First major use of computers: number crunching. The development of mathematical algorithms.
4. The Artificial Intelligence hype of the 60's and 70's.
5. Cryptography and computers.
6. Theoretical results: Turing, the incomputability theorems, equivalence of artificial languages, "All computer languages are the same", define and describe the P=NP question. "All the interesting questions are too hard".
7. The rise of the personal computer. I think "Fire in the Valley" is supposed to be good for this, but I haven't read it.
8. The rise of the internet...
Maybe about a week on each of those... you would have to move pretty quick and just hit the high points, but it would be a pretty good "tour" of computer science.
Speaking of tours, another really interesting book is: "The Turing Omnibus: 61 Excursions in Computer Science" by A. K. Dewdney. It has standalone, easy to read chapters on topics like Algorithms, Finite Automata, Simulation, Godel's theorem, The Chomsky Hierarchy, Random Numbers, Error correcting codes, Boolean Logic, Time and Space Complexity, Recursion, Neural Nets, The Fast Fourier Transform, Public Key Cryptography, Number Systems for Computing, Parallel Computing, Logic Programming, Church's Thesis, Relational Databases...
Heck, you could teach the course entirely out of "Mathematics, queen and servant of science" followed by the Turing Omnibus. That would cover everything important...
Torrey Hoffman (Azog)
I've done a little of this... (Score:2)
I've given three hours worth of lectures on the history of computing. That's an entirely different scale than an entire course, but I have some idea.
Swing by your library. There will be more books than you care to read on the history of computing. My school's relatively small library has 31 books [lib.sfu.ca].
Some ideas that you'll easily find information for in any history of computers book:
There are several courses out there [google.com].
Finally, talk to your undergrad secretary (or equivalent) and have her put you in contact with textbook publishers. A few quick e-mails that say "I'm looking for a text for a course about..." will land you more books than you know what to do with.
Greg
Culture (Score:1)
Another direction - History of programming langs. (Score:1)
The first bug (Score:2)
Other materials you really ought to check out for this class include:
Re:Who invented what? (Score:1)
The earliest clear reference to base ten notation using zero (which was a dot) is found in the Yogabhaasya (a commentary on the Yogasutra), which was composed perhaps around the 5th century. The usage suggests that the notation was already in wide use.
Some scholars think (or used to think) that Naagaarjuna's concept of emptiness (I guess many of you have heard of this: everything is empty, etc.) had something to do with the notion of zero.
Paa.nini's grammar (around 5th century B.C.?) uses an idea of zero substitution. A grammatical element is substituted with nothing (zero). Some think this is the invention of zero, in the sense ``nothing does some function.''
Note that Aryabha.ta was not from the 7th Century B.C., (as another poster suggested) but from the 7th century A.D.
By the way, Paa.nini's grammar sounds pretty much like an OO computer language. Declarations, short statements, classes, inheritance, functions, etc. Some linguists (I believe) have tried to implement Paa.nini's grammar as a program.
Re:Jargon File (Score:1)
--
Re:The first bug (Score:1)
In RMS's absence, I must point out that RMS is not a part of the open source movement, but rather the Free Software Movement, which has been around longer.
--
... (Score:1)
Just a thought that hit me: what about the philosophical views of CS? I think there are two big camps of CS (boy, are we over-generalising today):
Those who think that CS is math (von Neumann and 99% of all CS profs)
Those who think that CS is language (Ted Nelson)
Back to drinking more beer.. X-)
Re:Useful books: (Score:1)
The Machine that Changed the World (Score:1)
I don't think it can be done... (Score:2)
When I say I have read histories, I mean it. I have a couple of "History of Computers" books - wait, let me grab one - - Here it is:
"COMPUTERS: The Machines We Think With" by D.S. Halacy, Jr. (Dell Books, 1962)
A very yellowed paperback, I might add. Pretty good book giving a good overview of the history and current progress (for the time) of computers. I have a ton more. Typically, what I have found to be the best "histories" of computers are books that are current for the time, showing the tech where it is at "right then".
The field is just so large - I daresay you could spend your life just gathering the information for the history, never mind trying to organize it for printing (if it could fit in book form!). You could spend an entire book on the history of the punch card (and why the text monitor standard was 80 columns wide - they are related).
One frustrating thing you would find would be the number of dead ends - and lost data on various machines and systems. One area I have always wanted to find out more about is the hobbyist scene of the late 60s to early 70s, just prior to the invention of the 4004 - I am sure it existed, with people making their own machines from telephone relays and other equipment in their garages or basements - but there is so little to go on about such things (I have an article about how to build a telephone dialer system using simple logic circuits and a drum program system to dial a phone - a dial phone, mind you - for when a burglar breaks into your house - simple systems like that were being done, and possibly more complex ones existed as well). It is hard to even find stuff from the 70s - digging through a used bookstore, I found one book on how to build "The TV Typewriter" - most histories don't even mention it, but it is a big part of personal computing!
Good luck - you will need it. The best you will be able to do is skim. Perhaps let your students know this, teach them how to find out more about the topics themselves. There is so much out there...
Worldcom [worldcom.com] - Generation Duh!
From the course syllabus... (Score:2)
Objectives
The Department's objective in offering this course is to provide an introduction to the history of computation - including early computing devices and their uses, and the people who developed them.
Ideally, students who successfully complete this course will improve their understanding of how the field of computing developed and matured. They will be expected to be aware of the principal people, places, and events that shaped their profession. Such students will appreciate the broad sweep of this branch of history and be able to relate it to the social and scientific changes that were taking place during the same time frames. They will also be able to describe the concepts and show some understanding of the developments and be able to differentiate and make comparisons between them.
Additional information about a fall 1998 section of this course - namely, a collection of additional readings used to supplement the course text - is also available.
(Please note - I have "highlighted" those parts of the text which I thought gave insight toward the scope of this particular class)
I am not saying this class isn't a good class; however, judging from the syllabus alone, it seems to do just what I said could only realistically be done - namely, "skim".
It is an introductory course, not designed to give an in-depth view of the history. It seeks only to point out "principal people, places, and events", which, while those individuals are important, probably leaves out a lot of minor players who made contributions to the history of computing that weren't recognized until much later, if ever. (People like Jacquard, with his card-controlled loom, directly influenced Babbage, and further, Herman Hollerith, who later helped found IBM, which went on to make the standard 80-column punch card, which led to 80-column video displays. I am certain I am leaving out steps here, but the point is this is one known example - there are many lesser known ones, and students of the course will never know about them.)
I feel that this course seeks to point students in a particular direction. Perhaps some will go further with the knowledge gained from it, but most will simply take what was said in the course as "that is all there is", and not find out more about this particular area of study.
The syllabus admits to the history of computers having a "broad sweep", something that stands out in the course of all history. I dare to think, without having taken the course, that it probably starts with Pascal's investigations and inventions (or perhaps Napier's bones for calculations), and stops at the ENIAC era, with anything after that machine being relegated to "modern" times. But the fact is a lot of investigation into logic and calculation was made long before Napier or Pascal (indeed, look at the Antikythera Mechanism [giant.net.au] for such an example), and a lot of history has been made since the ENIAC.
Alas, I fear a lot of students will never really know about it, or care.
Worldcom [worldcom.com] - Generation Duh!
Everyone knows that Americans invented computing (Score:2)
Re:Jargon File (Score:1)
I second that motion. Follow that link now.
--
SecretAsianMan (54.5% Slashdot pure)
Don't forget the other dinos (Score:1)
Also, it might be fun to find a collector who has some classic systems and have the class meet at his house once. Just get his permission first.
--
SecretAsianMan (54.5% Slashdot pure)
History of Computing (Score:1)
Also, I taught such a course in 1999--http://instruct1.cit.cornell.edu/courses/st
and plan on revising it for the fall 2001 semester. I agree with the earlier post that you will have an abundance of materials. The Ceruzzi volume is a good basic textbook that you might use. Good luck.
Request for readthrough :) (Score:1)
Simple Solution (Score:1)
--
Existing Class Materials (Score:1)
The class was quite a bit of work, but was very rewarding. The final research project was very cool, as Prof. Edwards was very flexible about methods of submission (paper, video, web site, etc.) and topics ranged from women in computing to digital music to a brief history of Soviet computing (and yes, you can write a 3500-4500 word paper about the history of Soviet computing).
We learned about more than just the history of computers, however. This class forced me to think about how technology affects society, long before such musings became as popular as they are today. We learned about the role that census machines played in the Holocaust, about how a military boondoggle [mitre.org] supplied some of the key components to today's computing technology, how women who played such a key role in the early years of computing were pushed aside, and finally the role of technology in other countries.
Of course, this class holds a special place in my heart since I met my fiancé [umich.edu] while researching my paper, so I might be a little biased :)
origins and alternatives (Score:1)
Don't forget (Score:2)
Don't forget to mention Konrad Zuse and the Z-1 computer!
Re:Don't Forget Clocks. (Score:2)
The "first bug" is a myth (Score:1)
I take it by this you are referring to the moth found jammed in a relay of a Harvard Mark II machine in 1947. Contrary to what many believe, this is not the first usage of the word "bug" to denote a problem with a machine. The term was around long before the first computer was ever built -- it's recorded as far back as 1896, and was probably in oral use long before then. The fact that "bug" was already common in 1947 is evidenced by the Mark II's conspicuously-worded logbook entry: "First actual case of bug being found" (emphasis mine).
Regards,
Cool! (Score:1)
I'm completely fascinated with computers today, but love reading the histories as well... I can't even tell you how many times I've read about 'The Dawn of UNIX'!
Good luck, I hope this turns out for you!
-Ben
Re:Cool! (Score:1)
And that's after a preview! Way too early!
-Ben
Possible Text... (Score:2)
While it's not exactly a typical textbook, per se, CODE: The Hidden Language of Computer Hardware and Software [amazon.com] by Charles Petzold could be a handy reference. It covers the history of computing relatively thoroughly, from a hardware-concepts level. It isn't a book of merely names and dates to memorize; instead it seeks to actually explain the insights and advances which allowed computing to take place. As the book progresses, it becomes much more technical, and perhaps not as relevant, but I'd imagine that most of the book could be useful for a course like this, if nothing else to give you some background/research material.
U. Calgary has had such a course for a while (Score:1)
I believe he also spent a year as the Guest Curator of the History of Computing wing of the Smithsonian (and wrote the last program ever plugged into ENIAC, which now rests there).
It's still running at the U. of Calgary as CompSci 509.
Prof. Williams has a History of Computing Web Site [ucalgary.ca]. ...and just clip the last directory level off that to get his own web page.
Best of Luck.
Book recommendation (Score:2)
I just completed the book. It is essentially a history of computers as it relates to philosophy, math, and the nature of intelligence. Probably not the school-textbook type, but perhaps good for the class's recommended reading list (or your own).
Re:ABC machine (Score:1)
Of course, if you even _think_ of calling ENIAC the first, I will have to find you and hurt you, since Mauchly visited Atanasoff, discussed with him his ideas on this electro-mechanical marvel, and even observed the uncompleted ABC where it was being built, in the basement of the Physics building on the ISU campus in Ames, IA.
The Smithsonian iirc has a working model of the ABC that was built recently from Atanasoff's notes and spare parts lying around the US.
This message was brought to you by a CS grad who spent 4 years at ISU being indoctrinated in how great the ABC was.
Re:Who invented what? (Score:1)
Rebel Code by Glyn Moody (Score:1)
Turing, WWII Cryptography (Score:1)
--
--
Alternate Media (Score:1)
It may not be very scholarly, but it's lots of fun!
John [gray but grinning] Miller
Suggested Reading (Paper-Based, Too!) (Score:1)
One point (Score:1)
One approach that interests me in the history of computing is the cumulative nature of applications.
What makes a computer different from other inventions is the fact that people keep thinking of major new applications for them. You could take an application at a time and show how that idea started and how it developed. That way your work isn't strictly chronological, but goes back and forward. Example applications:
Originally, computers were built to do hard sums - Babbage's machines were designed to do statistical analysis in the search for ways to cheat at gambling! And that function continues today with the supercomputers predicting the weather.
data processing from the early business computers to the modern storecard systems that track every time I buy dogfood.
Personal productivity devices - the original VisiCalc through to Office XP (Bleaarggh!) or PDAs
Communications devices - POS terminals, cash machines, the net, e-mail
Games machines - pong to Quake
Control devices - Apollo program
Fun toys for geeks - some Linux history in there - but also things like the altair and the Sinclairs
etc.etc.
History of Computing (Score:1)
Peloponnesian wars, typical boring history class (Score:1)
In my opinion, what is needed in this history class, as with all history classes, is to begin with a complete graphical timeline, and then to selectively "drill down" to key inflection points, and relate those points as effects of what came before and causes of what comes later.
The rapid pace of computer technology and knowledge means that a large portion of the progress came from the youth, and so to study the "culture of computing" means to study the youth of computing, e.g. Steve Jobs rebelling at Atari, the Atari 800 and Commodore 64 training the computer programmers of the 1990s, and of course the creation of Linux. Each generation, of course, also had its "old guard", and the interplay between the old and new guards are interesting. Inventions by the "old guard" include the transistor, microprocessor, and UNIX/C. These were quickly adopted by the youth and rapidly advanced, until these youth grew up and became their own "old guard", etc.
From analyzing a few of these scenarios, it should be possible to abstract (meta) out some "patterns" (and I am using this word in the "patterns of organizations of people" sense). From these patterns, it would be an interesting exercise to apply them to the present day in order to predict the future -- come up with some different scenarios. Archive the predictions, then rate yourself in a few years in the same sort of class, and refine the process. This refining of the process represents another meta step outward.
Then you'll be at CMM-3. Right now the class sounds like CMM-0 -- perhaps some good raw material, but not very effective.
See alt.folklore.computers (Score:1)
Re:John Atanasoff: Inventor of the digital compute (Score:1)
ABC History [iastate.edu]. Also a bunch of general History of Computing links.
Re:How about a little hands-on? (Score:1)
I like this hands-on idea. For the full experience, though, don't forget to give them an allotment of computer time, then charge them for more when they use it all up the first time they create an accidental infinite loop.
"If I removed everything here that I thought was pointless, there would be like two messages here."
World's first programmers (of ENIAC) were six women (Score:1)
Read about it in the following links:
http://www.witi.com/center/witimuseum/halloffame/1997/eniac.shtml [witi.com]
(the above link has pictures of the six)
And:
http://www.wired.com/news/print/0,1294,3711,00.html [wired.com]
Technology History Resources (Score:1)
A class much like this ... (Score:1)
Re:A class much like this ... (Score:1)
Hah hah hah.
Metropolitan State College of Denver.
Problems with computing history (Score:1)
It goes on like that. Every area seems to have differing accounts. It will be hard to combine all of it into a single, sensible course that doesn't contradict itself.
Of course, the best place to start is books that have already been written. Check out Steven Levy's stuff (Hackers & Insanely Great), also Where Wizards Stay Up Late, Fire In The Valley, and loads more on top. Slashdot has done loads of reviews of books like these; search a bit.
Oh, and please, don't teach the fallacy about the Internet. It wasn't built to survive a nuclear attack at all.
Alan Turing (Score:1)
As many other posters have pointed out, how can you give a course about the history of computing and ignore Alan Turing ?!
Turing was undoubtedly one of the great minds of the last century.
Have you ever stopped to think what the world would be like today if Alan had not been around when we needed him? The Second World War quite possibly would have been lost to Germany; stored-program computers might not even exist, or would at least still be in their infancy; everything stemming from stored-program computers would not exist. There are so many things we take for granted that Alan Turing, through his brilliance, has given us.
And for what thanks, I ask you? None, that's what. Denied when he was alive, possibly conspiratorially murdered (because of his at-the-time controversial and illegal sexuality - he was fairly openly gay), and unknown to almost everybody now that he is dead.
Strange, isn't it - ask somebody who "Bill Gates" is, and chances are they know; ask somebody who "Alan Turing" is - they won't have a clue. But who is/was more important to the world?
</RANT>
---
James Sleeman
Re:Is this a BRITISH perspective ? (Score:1)
Re:Jargon File (Score:1)
Jargon File (Score:5)
--
8 hours per lecture (Score:2)
Need More Focus (Score:2)
Your description seems to lack attention to maintaining a consistent view. Working with the origins of "zero" or positional number systems on the one hand, and worrying about a study of a subculture of *nix users on the other, is too rapid a move between the view at 50,000 feet and the view at 5 feet. Is this a big-picture course or a detailed examination of modern culture?
The first, in many ways, is a history of mathematics. The second, modern cultural anthropology. There may be connections and contacts between the two, but not enough to meld courses on those subjects.
Since you include ANALOG computing you open the door to non-positional number system computing. The ancient Greeks (and others) made substantial use of geometry to calculate lengths, areas, etc without a positional number system. I can only guess at what the Chinese might have done.
One does wonder if the 'culture' of the slide rule, as well as its 'technology', would be included in your course. I can see the chapter now: "the slide rule and the pocket protector."
Good luck with the course design!
Another book about zero (Score:1)
Re:foobar (Score:1)
Re:Textbook or Book.... (Score:1)
Re:CS History 101? (Score:1)
culture, it's pure barbarism!
resources (Score:1)
some other sites of interest are the IEEE annals of the history of computing: http://computer.org
Virginia Tech and the NSF's history of computing: http://ei.cs.vt.edu/~history/index.html [vt.edu]
to get another audience on this you may want to ask your question on or join http://aoir.org [aoir.org], the association of internet researchers
My own work and teaching is centered more around the Internet, but it seems to me you want to look at the earliest foundation of computing, such as the origin of information theory, which is quite interesting in itself.
Jeremy
Center for Digital Discourse and Culture
http://www.cddc.vt.edu
Re:Currently doing something similar.. (Score:1)
They actually sued the Germans for damaged material at the end of the war.
Basically they saw a business opportunity and took it, making it easier to kill Jews in the process.
Currently doing something similar.. (Score:1)
I tried something similar to what you are going to try, or are attempting..
I have ended up writing a paper on the life of Alan M. Turing.. (basic founder of modern comp sci)
The History department wouldn't go for the whole history-of-computing idea.. so before you get too far into it.. if not already..
consider the following..
1) I ran into the problem that no one in the History Dept was willing to mentor my thesis project.. concerns about keeping a
"historical" approach to it, about which primary materials would be used.. and about the subject being so very recent...
2) Is this going to be taught as a History class, a Computer Science class, or an MIS class? Those would be three completely different classes in themselves.. I am TA'ing a Computer Law and Ethics class and it's really sad what seniors and juniors in computer science are turning in as far as their writing skills.. analytical ability when writing.. etc.. some of them don't even know how to use a word processor, but they know how to code (well, some of them), which is just sad..
3) Though a lot of computing history has been in the last 5 to 10 years.. you shouldn't spend too much time on it..
4) Break the class into topics and show the evolution of thoughts from start to finish.
A suggested Topic Breakdown in no particular order.. mostly just a braindump..
1) Introduction, what is a computer, and the originations and overview.
(Don't dumb down the "what is a computer" part, or over-technicalize it either)
a) The Word Meaning
b) Early Thought
c) Mechanical Computers
2) Basic Ethics of Computers, Setting a baseline to what the class will be taught on.
3) Early Computers Pre-1940
4) Birth Of the Computer as we Know it
a) The Impact of WWII
1) IBM's Role in the Holocaust
(Don't know a lot of detail, but have heard that they were hired by Germany to help them track down the Jews in pre-war years)
2) Artillery Projection
3) Cryptology
4) Massive Influx into Research Funding for the first time..
b) Post War Thought - From Concept to Reality
c) Impact of the Cold War
1) Large Numbers of "SuperComputers"
2) Need for Networking
3) Need for Mass Storage
d) From Vacuum Tube to Transistor
e) The Impact of the Space Race
f) Academic Research
5) The major computer manufacturers, their origins.. and evolution
1) IBM and their eventual monopoly
2) Intel
3) The Former Cray Computers
4) Unisys
5) Sun Micro Systems
6) Microsoft
7) Motorola
8) Compaq DEC (VMS)
6) Birth of the Personal Computer 1979-1992
A) The major early players (Describe the Differences in Platforms (i.e. Different OSs, Different Processor Bases, Corporate Methods, Pros/Cons))
1) Commodore (1979-1994)
2) Apple (1979-Current)
3) IBM (1981->) and its compatibles
4) Misc Others
5) The First Wave of PC computer manufacturers dies off.. (approx 1985-1989)
6) Description of Apps and Hardware Available to them
7) The role of Mainframes and networking in general during this time..
7) The Web Boom (1992-2001/current) (it may be over, or the next period may be beginning)
A) The Early Web Years 1992-1996
1) Early Online Services (AOL, Prodigy, CompuServe) and BBSs - how things were done before the Internet (I personally ran a BBS through 1996)
2) Internet before the Web
i) Government Origins
ii) Telnet, Gopher, and FTP, early utils available.
iii) Birth of HTML and the Web Browser (1992-93)
Lynx, Mosaic, Netscape, and later IE.
3) The Web Initially (1992-1995/96)
i) Very Basic, Plain HTML, Slow, Expensive
ii) Very little E-Commerce
iii) Still Mainly used by Academics, Government, and Just starting to catch on.
B) The explosion..
1) E-Commerce
Created a demand for Access and Demand for Web Servers
2) Linux Birth and Unix Revival (Begins)
3) Wide Spread Access
4) Cheap PCs
5) Faster Access (Broadband)
6) Hardware Getting Exponentially better and cheaper every few months
7) Increased Availability of Software
8) GUI Interfaces.. pioneered at Xerox PARC (the Alto and Star) but never really commercialized by Xerox itself; shipped on the Mac in 1984, Commodore's Amiga in 1985, and MS Windows in 1985..
Well I just realized how long I have spent replying to this.. and I need to get back to what I was working on.. but if you want more ideas, etc.. email me.. I have a few books.. though I wouldn't recommend the book I am currently using in the class I am TA'ing, as it is more a computer ethics class book.. but it ain't bad overall..
It could be quite a project though.. have fun.. Let me know what you go with..
But as I said I gotta go and get some things done so I can get out of college this semester..
Chris Souser aka Volhav
volhav@acerbic.org
Re:Useful books: (Score:1)
BTW, that book's been updated: it's now called "The New Turing Omnibus: 66 Excursions in Computer Science"
Don't Forget Clocks. (Score:2)
Almost every digital computer has a clock in it. You should devote one lecture to the history of clocks.
Some of the fancy cuckoo clocks built in Europe at the height of their popularity could arguably be called the first "robotic multimedia" computing devices.
Where? (Score:1)
Re:foobar (Score:1)
I am sick of learning some language and being forced to type
@foo + @bar =
or
#!/usr/bin/perl -w
use strict;
my $foo = "bar";
print $foo;
---
Good source of information (Score:1)
Re:Useful books: (Score:2)
The flowery, overly verbose language can get a bit annoying at times, but it is a reasonably interesting read.
Re:Don't forget the other dinos (Score:1)
"IBM's Early Computers" by C.J. Bashe, L.R. Johnson, J.H. Palmer and E.W. Pugh, 1986, MIT Press
"IBM's 360 and Early 370 Systems" by E.W. Pugh, L.R. Johnson, and J.H. Palmer, 1991, MIT Press
Both part of an MIT Press series on the history of computing: http://mitpress.mit.edu/books-in-series.tcl?series =History%20of%20Computing
The two books on IBM computers give a lot of internal IBM culture, but one should be aware that the authors were all IBM employees when the books were researched and written. This gave the authors excellent access to IBM sources, but also gives the books a decidedly IBM point of view. Caveat lector.
who is Turing? (Score:1)
Slashdot featured a link to a pretty good article [umaine.edu] that holds a very cute and short intro to how Alan Turing arrived at his famous Turing Machine and the start of computers and computer science. It does a quick mathematical history from Cantor through Hilbert to Gödel and Turing.
I have studied computer science and find that certain facts about the history of computer science seem to go over better with non-computer-scientist audiences (the friends and family to whom I try to explain what computers are). Your lessons should at least cover the following topics:
Explain what the generalized Turing machine is and how it was (and still is) used to describe the 'limitations' of computing machines.
Explain with as little math as possible what NP means and what impact it has on computing.
Explain Moore's law and compare it to other industries to show that computer science is something very, very new in world history.
I recommend reading "The Age of Spiritual Machines" by Ray Kurzweil. It's both an interesting overview of computing history and a look at its future.
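The first topic above can be made concrete in a few lines of code: a toy single-tape machine. This is a minimal sketch for classroom illustration - the state table and rule encoding here are made up, not any standard formulation:

```python
# Minimal Turing machine sketch: rules maps (state, symbol) ->
# (new_state, write_symbol, move), with move in {-1, +1}.
# The machine halts when it reaches the halt state or runs off the tape.

def run_turing_machine(tape, rules, state="start", halt="halt"):
    tape = list(tape)
    pos = 0
    while state != halt and 0 <= pos < len(tape):
        state, tape[pos], move = rules[(state, tape[pos])]
        pos += move
    return "".join(tape)

# A rule table that inverts every bit, moving right until the tape ends.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
}

print(run_turing_machine("0110", flip))  # -> 1001
```

Even a trivial machine like this makes the point that a fixed, finite rule table plus an unbounded tape is all the "computer" the theory requires.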
Traa
---------------------------------
Re:The history of computing (Score:1)
There's also the University of Manchester Department of Computer Science history [man.ac.uk] and "50 years of computing at Manchester." [computer50.org]
Or the Alan Turing Home Page. [turing.org.uk]
Alan Turing used to drink at the Salisbury Arms, on Oxford Road in Manchester, which although serving a decent pint, is now way too packed in the evenings to be able to think in base 32 anymore.
Re:The history of computing (Score:1)
Re:The history of computing (Score:1)
Boring!!!!! (Score:1)
Last semester our seminar course was based on Computing history. Let me tell you that it was the worst thing that I ever had to sit through in my entire life!
I ended up talking about the history of super computers, which was not that bad, but most of the people decided to talk on people like Babbage, Turing, etc.
Big waste of time!
Re:foobar (Score:2)
If you doubt it, look at the foo-less world of DOS/Winders. Their idea of metadata is to prepend "my" to everything. Would you rather have:
myvariable = "mystring";
or
foo = "bar";
I must admit, however, that my favourite metadata usage is in the php manual:
strstr(haystack, needle);
very clever and much better than:
strstr(mystring, myotherstring);
History and Culture of Computing (Score:1)
Computer Cultures (Score:2)
Don't forget UI... (Score:2)
Not that that's a comprehensive list or necessarily a good organizational conceit, but from an end-user perspective, it's what's important. (A caveat: the "end-users" of punchcard systems and such were very likely to be techies themselves.) This is, as usual, the lesson that some of the Slashdot crowd would do well to learn from Apple: not everyone cares what's under the skin of their computing environment. Not everyone knows, or wants to know, or needs to know.
Steven Johnson, founder of Feed [feedmag.com], has an excellent book, Interface Culture [amazon.com], which details the progression of computer interfaces and turns a critical eye on their cultural and psychological implications.
I'm reading this book now for a seminar on human-computer interaction, but it's just as applicable to the sort of course you're proposing. What I particularly like is its focus not just on the development of GUIs but also on the power of textual and hypertextual interfaces, which honestly are the only popular interface innovation of the last decade or so, since the desktop-metaphor GUIs have stagnated.
Good luck with your syllabus. Count me among the folks who would like to see what you come up with. (For the record, I really like the idea of teaching about the history of computer science problem-solving, i.e. NP-complete problems, number systems, etc.)
Andrew
Who invented what? (Score:3)
As for older theoretical subjects, one book you'll find invaluable is Donald Knuth's The Art of Computer Programming, in which he painstakingly traces back the history of various mathematical and computational developments.
Who invented the zero?
w/m
Had course. Having prior background helped. (Score:2)
Many of the students found this quite challenging and a bit overwhelming. I had a bit more background and so could place many items into a pre-existing framework of knowledge. I found the course very interesting and very much appreciated the context it provided to topics addressed elsewhere in isolation.
Regarding your goal, I see it as a good thing. I also support the idea of offering it to those with some prior classwork and experience under their belts. Too many people I meet these days don't have sufficient understanding of the context in which they work and create. The ranks of the old "gurus" is thinning, and their stories and hard-won knowledge are becoming scarcer. I believe giving current students some knowledge of what has come before is a good thing.
The "early days" also produced a sense of community that should be remembered. (Note that this happens in many fields, not just information systems or whatever you call it.) I recall getting a new editor on the school VAX just by asking and by being willing to "sneakernet" a tape from someone's employer. They were happy to see it used. With all the "business" going on these days, it's nice to remember the generous spirit of some of those who contributed so much to getting us where we are, now. And who found facination in the topics themselves, regardless of their financial worth.
As for books and such, I can recommend:
The New Hacker's Dictionary, 3rd ed. -- the Jargon File plus a bit, stamped on a dead tree.
Where Wizards Stay Up Late -- the birth of the ARPANET and subsequent things.
Don Knuth's writings. Aside from being thorough, he's informative and funny.
Good luck!
How about a little hands-on? (Score:2)
-----------
Look at the cryptography literature (Score:2)
A lot of info can be found in the history-of-cryptography literature. Many of the pioneers of pre-digital computing were also into cryptography: Babbage broke the Vigenère cipher, Turing broke Enigma (the source of the Ultra intelligence), etc.
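For reference, the Vigenère cipher Babbage broke is simple enough to sketch in a few lines. This assumes an A-Z alphabet only; the key and plaintext are the standard textbook example:

```python
# Vigenère cipher sketch: each letter is shifted by the corresponding
# letter of a repeating key (A=0, B=1, ...). Decryption shifts back.

def vigenere(text, key, decrypt=False):
    sign = -1 if decrypt else 1
    out = []
    for i, ch in enumerate(text):
        shift = ord(key[i % len(key)]) - ord("A")
        out.append(chr((ord(ch) - ord("A") + sign * shift) % 26 + ord("A")))
    return "".join(out)

ct = vigenere("ATTACKATDAWN", "LEMON")
print(ct)                                   # -> LXFOPVEFRNHR
print(vigenere(ct, "LEMON", decrypt=True))  # -> ATTACKATDAWN
```

Babbage's insight was statistical: repeated plaintext under a repeated key leaks the key length, after which each position reduces to a simple Caesar shift.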
I would not spend more than one week on the pre-machine computing era, but the idea that choosing the representation is the key to making systems computable is important. Place-value systems are important because they make the abacus possible.
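The representation point can be shown concretely: the same quantity rendered in different place-value bases. A minimal sketch (the digit alphabet and example number are illustrative):

```python
# Place-value representation: each abacus column (or register bit)
# stands for digit * base**position. Repeated divmod peels off digits
# from least significant to most significant.

def to_base(n, base):
    digits = []
    while n:
        n, r = divmod(n, base)
        digits.append("0123456789ABCDEF"[r])
    return "".join(reversed(digits)) or "0"

print(to_base(1848, 2))   # -> 11100111000
print(to_base(1848, 16))  # -> 738
```

The algorithm is the same for every base, which is exactly why positional notation mechanizes arithmetic in a way Roman numerals never could.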
On the invention of the electronic computer there are a bunch of competing claims. Before WWII Konrad Zuse built the Z1 and Z2, arguably the first working programmable computing machines, though they were mechanical/electromechanical rather than electronic. The Bletchley Park systems were the first major electronic computational devices - although programmability was limited.
You could also look at Feynman's biography, where he describes the computer facility of the Manhattan Project; they used tabulators for computation. [Kind of brings into question the effectiveness of ITAR restrictions on 'supercomputers' to 'stop rogue nations building nuclear devices', eh?]
After that it depends on whether you read a UK or US history. Both tend to imply that the other groups didn't exist - although in practice they were often collaborating. Manchester tends to be ignored in the US histories, as for that matter is the fact that the US was nowhere near dominant in computing until comparatively recently.
There is quite a bit of computer history in the book 'Big Blue' which is a biography of IBM written by the anti-trust investigator who tried to break them up.
Another era not to forget is the Micro-vs-Mainframe fight. Again the real action here was in Europe, where manufacturers were building $100 computers aimed at the consumer market that sold in the millions. Sinclair outsold PET, Apple and Tandy put together in those days. A lot of that was driven by the video-game scene.
There are a bunch of books around like Where Wizards Stay Up Late, Architects of the Web, etc. However, beware that many of these were written as corporate PR puff pieces. 'Wizards' is mainly an account of the BBN view of the Internet. 'Architects of the Web' is a hard-core hagiography of Marc Andreessen written by the Netscape PR firm; Tim Berners-Lee is mentioned only in passing, to explain how he got everything wrong. It was an attempt to rewrite history with Marc as the sole inventor of the Web.
A lot of the other computer histories tend to fit in the same mould.
Re:Who invented what? (Score:2)
It was never there to begin with.
History and Culture of Computing? (Score:2)
Computer Architecture: A Quantitative Approach by John L. Hennessy and David A. Patterson, Morgan Kaufmann Publishers; ISBN: 1558603298
This is mainly intended for studies of computer architectures and instruction sets, but goes into a fair amount of detail, with additional reading suggestions, on the history of the computer. It covers from early computers all the way up to the most recent (Pentium series). It is primarily devoted to teaching computer architecture using the MIPS instruction set, but has rich information throughout on practical aspects of computing, the evolution of Intel's dominance in the PC chip market, the downfalls of some of the many forgotten companies that were early innovators (and computing giants, for that matter), evaluation benchmarks, and comparisons between Intel and Motorola processors. The list goes on, and I have only read up to chapter 6. I highly recommend looking at this if you can find it in a library. Otherwise it is fairly pricey... but it keeps on giving. Pretty light on the culture side, though.
Other than that, I don't think this class would be complete without an introduction to Moore's law and its predictive assertions as to the future of computing. His original paper is a good start.
Cramming more components onto integrated circuits [intel.com]
Gordon E. Moore, Electronics, Volume 38, Number 8, April 19, 1965
Lastly, some mention of current efforts in nanotechnology and molecular computing to surpass the limits described by Moore's law may be worthwhile.
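As a rough illustration of those predictive assertions, you can project transistor counts under an assumed doubling period. The 1971 baseline (Intel 4004, roughly 2,300 transistors) is a commonly cited figure, and the two-year doubling period is an assumption; both are for illustration only:

```python
# Back-of-the-envelope Moore's law sketch: exponential growth with a
# fixed doubling period. Real counts only loosely track this curve.

def moores_law(base_count, base_year, year, doubling_years=2.0):
    """Projected transistor count after (year - base_year) years."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Thirty years of doubling from the 1971 baseline: 2300 * 2**15.
print(round(moores_law(2300, 1971, 2001)))  # -> 75366400
```

Plotting the same function on a log scale makes a nice classroom exercise: the "law" is just a straight line whose slope is an empirical observation, not a physical constant.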
lived the history vs just found out about it (Score:2)