Education

Computer Historian?

mike sollanych writes: "Is there any sort of job in the world for someone who's really interested in computer history? I love it, myself, but I'm just approaching the end of high school, and it's time to make some life decisions. So, is there any place in the industry for a computer historian?" How about it? Many businesses and government agencies employ company historians to record activities which might otherwise get overlooked as mundane. What skills would most benefit a computer historian, and where are such people needed? Does such a job exist in any but the largest of companies now? Tell us what you think.
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    Why don't you write them and see what's going on there. At any rate, it does sound interesting. I don't know if there's a huge market, but maybe you can start the market yourself.
  • by pb ( 1020 ) on Monday August 21, 2000 @11:10AM (#839246)
    There's a computer museum in Boston, and Bruce Sterling has written about it.

    I don't know if you could get a formal position, but by all means, start a web site! Even a lucid history with pointers to resources would be nice.

    I have a good book from ~'86 that goes over the languages and the computer internals of the day (specs on the C64 hardware, a basic memory layout of the TRS-80, etc., etc.), and I'm sure you can find more of that at your local library. I got that one from a library book sale, actually!
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
  • Seems to me it makes a better hobby than a job; knowing your sh!t from years of experience or knowing a different way to do things because that's how it USED to work (Win 3.1, DOS, and MacOS 6.x gurus unite) can definitely help you get your "regular" job done more quickly and effectively. I know DOS has saved my @$$ a few times when Win98 has puked on me.
  • by Frymaster ( 171343 ) on Monday August 21, 2000 @11:11AM (#839248) Homepage Journal
    no, seriously, Steven Levy seems to make a decent living at it (Insanely Great, Hackers et al.)

    Personally, I've always been a bit of a computer history geek myself (as my .sig probably attests) and I'd sure as hell be willing to buy yet another book on the subject... so write it.

    My only suggestion is to start at Alan Turing (or if you go back to Babbage, at least include him). Most people look at the pre-dawn of computers as a hardware-only affair and tend to skip over Good Ol' Al's contribution on the software front....

  • by Anonymous Coward on Monday August 21, 2000 @11:13AM (#839249)
    This is typically the domain of Information Science, a combination of Library Science and Computer Science. Particularly you would be looking at a combined degree in IS plus potentially Sociology. Your best bet would be to pursue a government-level position if possible (think Library of Congress) as these people generally have the budget to fund computer history. Also keep in mind there is a big emerging field of data/format archival (ie, how do I read an 8 inch floppy disk on a modern computer etc. etc. etc.) which extends nicely to the field of computer history and human factors. Good luck! ;chroohan
  • Good luck getting a job like that, but maybe you can set Hollywood straight. Things like:

    • Big spinning 9 track tapes have been obsolete for years (The X-files had the sense to use DAT).
    • The rows and rows of blinking lights have moved to the server rooms, and are no longer on the front of the computer.
    • Everything is done with windowing systems, you don't look up a driver's license with a command line anymore (except maybe in a cop car).


    Thanks,
  • With the availability of TONS of historical data on computing at the click of a mouse, the job of a Computer Historian is pretty much obsolete. (No pun intended.)

    Years ago, when the world wasn't interconnected, this may have been a viable hobby (note, 'hobby', and not 'profession', as computer historians are generally hobbyists), but not today. Anyone can hit up a search engine, and search for....say, ENIAC [google.com] or EDVAC [google.com], and be presented with truckloads of material. The internet was spawned from computer history, so it's only natural that it has plenty of reference material regarding its roots.

    In the age of the Internet, and its vast amounts of computer-related historical data, a person trying to do the same job would be pretty bored.

    -- Give him Head? Be a Beacon?

  • There are documents out there that are in old formats. Regular historians and archivists are going to need a computer guy who knows how to get information off (for example) an 8" CP/M floppy.

    Our secret is gamma-irradiated cow manure
    Mitsubishi ad
  • by Durinia ( 72612 ) on Monday August 21, 2000 @11:15AM (#839253)
    ...that there wouldn't be too many jobs for someone interested in computer history anywhere in industry. It IS possible that some well-funded research department might have some interest, but you'd have to make a hard sell for WHY it's good for them to be grounded in history.

    I think if it's REALLY interesting to you, you should consider entering Academia and studying (and teaching) it there. I would have LOVED having a Comp. History course, but, as of yet, few professors are young enough to NOT remember when computers were a "new thing".

    I think it might be interesting to see the very specific patterns and progression of computing throughout history. If you wrote some papers on it, I'd certainly read them!

    Best of luck!

  • At the National History Museum (Smithsonian in D.C.) they have had an exhibit on computer technology for quite some time. It's really an intriguing walkthrough of the computer age up until now, showing everything from Babbage's machine up to the microcomputer. They seem to be constantly updating the exhibit (been there 2x in the last 4 months), so I am certain that there must be a niche available for someone who knows their stuff.


    -*-*-*-*-*-*-*-*-*-*-*-*-*-*
  • First I'd like to tell you not to worry too much about having to make the "good decision" right now. High school teachers seem to put a lot of emphasis on making the "right decision" at that precise moment. I was dumb enough to believe them at the time. Trust me, I changed my mind a lot of times and studied a lot of different things before finally deciding what I wanted to do. I am just saying that so you don't feel pressure from teachers/parents; at this point in your life it's fine to make a choice that is not definitive.

    To answer your question, I haven't seen anything like that, but most computer-related programs have some sort of "history of computing" course, so I can easily see that there will probably be some openings in that field in the future. If this is what interests you, just go ahead and make your way...
  • Huh? I think you were confused:
    He wanted to be employed by a company as a computer _historian_
    not
    _hysterically_ (as in panicked) employed by a computer company.

  • To be honest I don't think the computer has been around long enough to warrant such a field. For most of us "computer history" is just a recent memory. Perhaps in 20 or 30 years computer history will outgrow the bounds of the first chapter in your favorite Computer Science textbook and have its own area of study.
  • It'll be hard to make a living in a field dominated by people who were actually there and make historical records as a hobby.

    Computer people are just like that. They type all day, keep an eye on the developments around them, and have good memories, so it's no big deal for them to sit down one day and type up the highlights of all the developments they have seen (within a narrow focus) in the last 20 years of their career.

    I'd as soon start a business based on creating a new desktop OS!

    ---
    Despite rumors to the contrary, I am not a turnip.
  • About all I can think of is working in a museum of some kind. If anyone out there just wants a nice reference site regarding the history of computers, the best one I have found is A Chronology of Digital Computing Machines [best.com]. It stops at 1952, but the wealth of information regarding the predecessors to today's machines is very interesting.
  • While I can't think of any (profitable) uses in industry for a computer historian, in academia, it'd be a huge bonus.
    In my college experience, the professors that were easiest to learn from (and best to take) had extensive knowledge about where computers came from (maybe because they could predict where they were going?).
    As was mentioned in a post above, a good base in computer history can help you with current projects, which is excellent for beginning computer students. And every university has a required class of some kind to prove that the geeks can string a sentence together (so they can post on slashdot).
  • Another good book about the history of computers is called The Devouring Fungus [amazon.com]. It's got a good collection of actual history and urban legend (or maybe net legend would be more accurate?).
  • The idea of compiling information about the computer world has many benefits, aside from trivial interest. A concept that is not in use may have fallen out of the zeitgeist due to lack of relevance, but in 15 years could again have merit. I would consider it a mistake to make a direct comparison between computers (which do lose worth as time passes) and the concepts behind them. The ability to store and reference information regarding the changes in computers and draw parallels between the past and the present (maybe even the future, as history tends to form patterns) would be a valued and relevant ability. But, I am not going to pay you. It might have to be a government/private industry endeavor, akin to the Genome Project.
  • And if writing historical stuff doesn't work out, there's always the historical-fiction-with-real-people-as-characters route, a la Cryptonomicon. And I think we need more historical fiction in the geek world. I like science fiction and futurism as much as the next guy, but there's something about historical stuff that really grabs my attention. Besides, the legends of the Elder Days need to be elaborated and told so the youngsters like me will have the proper appreciation for the days of ITS and PDPs and batch processing... And a good background in Computer History would really be a good place to start on something like that.
  • I don't know about business and getting paid, but if I had a sawbuck for every guy I've dated who filled his apartment with antiquated legacy hardware under the aegis of "creating a computer museum" I'd buy a couple of really nice dinners at the steakhouse... :>
  • And computer screens generally don't have BIG progress bars or countdown timers. Remember the dramatic "uploading virus" scene in the spaceship, or the "countdown to arrival" scene on Air Force One in Independence Day? Boy, those Powerbooks sure are versatile machines, aren't they :)
  • by Durinia ( 72612 ) on Monday August 21, 2000 @11:21AM (#839267)
    If you're interested in history, you might be interested in perusing the wealth of information at Iowa State's John Vincent Atanasoff Archive [iastate.edu]. It has some great information on the inventor of the Electronic Digital Computer.
  • by yakfacts ( 201409 ) on Monday August 21, 2000 @11:22AM (#839268)

    IMO, students should be required to take detailed courses in computer history. Why? Not for the trivia, but to understand why decisions were made, and what has been tried before.

    Too many students come out of school thinking they know it all, but understanding only a tiny bit about computers beyond the generation they learned to program on. Understanding the computers of the past would be useful.

    Alas, I have found no such position, or I would apply for it tomorrow.

  • Computer history sounds real interesting. I have also heard of organizations that collect old computers and try to preserve them and their software so that part of history isn't lost.

    Geeky.org [geeky.org]
  • If that's the way you feel about it, what do we need mechanics for? I'm sure you could go on the internet and locate all the information you'd need to fix your car, no matter what's wrong with it (well, maybe not if you blew it into millions of pieces, but you get the idea). Heck, why hire a network administrator when the befuddled users could just get the information they need from the internet?

    Oh, that's right. Not everybody has the time to find it themselves, and some are just too lazy or too dumb to know where to look. Having somebody on hand who knows from personal experience how to do the job right, especially with archaic computer hardware, is a very valuable thing.

    Andrew Walbert
  • Is there any place for a computer historian? Yeah: maintaining legacy systems is all about history :)

    baldeep

  • I just threw away an old cdrom "cartridge" thingy & a Vic 20 tape drive he could have had for his collection.
  • There are other good books too. Alan Deutschman's _The Second Coming of Steve Jobs_ and Peter Wayner's _Free for All_ come to mind. It's hard to write about technology without telling some of the history. These books aren't histories per se, they're more books that examine one area with an eye toward the future. They use the past as prologue.

    The great news is that studying computer history is dirt cheap. Old machines are sold for next to nothing. You can probably buy an old Cray 1 if you convince the old owner that you're going to take good care of it. The biggest expense will be putting in the right power equipment and cooling machines.

    It can also be fun. I remember a good friend of mine had an old PDP-8 in his living room. He just flicked it on one night and said, "Go ahead, it doesn't need to boot." I still remember the speed of core memory every day when I turn on the machine and wait for everything to load. (Of course rebooting is a real bear. He needs to reload the OS with a paper tape!)

    My favorite folks in computer history are the ones who make emulators of old machines. It's possible to have a virtual Commodore 64 on your desk if you run the right emulator. That means you can run your old software without futzing over old hardware. This kind of virtual collecting is pretty cool and arguably the right way that cyber savvy folks should be collecting. Who wants to get into the artificially introduced scarcity of physical goods? Let's leave that for the baseball card and stamp fanatics.

  • Ha, ha, just become a Java programmer, since it is pretty much history....
  • I graduated not long ago with a B.S. degree in history, and a strong interest in computers. What it gained me was a position with a smaller company that had need of an IT person who could keep the old Tandy machines communicating with the new HP Pavilions, NT Server, and the like. In all likelihood the ability to understand and work with the recently or not so recently obsolete technology is of far more use to the world than simply archiving the hows and whys of PCs past.
  • Computer history, broadly defined, will probably be a growth industry in academia. The perspective of the properly trained historian/sociologist/economist is different from that of the participants. For example, the key decision makers during the Vietnam war have written books about what they did, but the best books on the topic are by professional historians (usually academics) who do a much better job than the participants in synthesizing the relevant information and being relatively objective.

    If you combined computer history with some policy work, the project could be really cool. History can teach great lessons about what to do (and not to do) in the future.

  • Even a lucid history with pointers to resources would be nice.

    Lucid means clear. Lux means light in Latin. Sorry for picking on language.

  • It would be great to have a true, accurate and complete record of all the history of the computer industry, esp. before Microsoft changes it all (as we have seen they have tried).
    You also have to watch out for the Smithsonian - they seem to not always have their facts straight either (http://www.yale.edu/scimag/Archives/Vol71/Tesla.html) (great article - you can see the 'patent wars' have been going on ever since the patent system was started).
    So a private, complete, objective history of all the computer happenings, devoid of corporate influence, would be a really Good Thing.
  • by BigBlockMopar ( 191202 ) on Monday August 21, 2000 @11:31AM (#839279) Homepage
    What skills would most benefit a computer historian, and where are such people needed? Does such a job exist in any but the largest of companies now?

    Nah, I doubt it. I'm sure that most of the old information and stuff that might have been on vintage machines has long been rendered obsolete or transferred off onto a newer computer.

    Antique computers, like antique radios or antique TV sets, will never have any value except to Hollywood for use as props and as toys for hobbyists and collectors.

    Let's face facts: my Trinitron uses a lot less power than my 1954 General Electric TV set. The Sony has stereo sound, a remote control, goes beyond channel 13 and - get this - it's color! But the old GE is a really neat piece of history, and while I only ever turn it on every now and then, it has a prominent place in my living room.

    Now, here's a funny thing: ubiquitous as the TV set is, it has, perhaps, been a victim of its own success. There are fewer pre-WWII TV sets out there now than there are Stradivarius violins. 1950s and 1960s TV sets are getting rare, too. People tend to hang onto old radios because they're usually rather small or have more decorative cabinets.

    There are lots of antique radio museums and collectors around the world, but there are only a handful of antique TV collections. (One of the best is the MZTV Museum in Toronto [mztv.com].)

    Early computers are even less useful, from a practical standpoint, than a 40-year-old TV set; at least anyone can figure out how to use the 40-year-old TV, but few of us here could use even a 20-year-old computer effectively. Old TV sets often had gorgeous woodwork and great polished brass and chrome accents that were futuristic for their day. Early computers had that sort of retro feeling of "high-tech" too - a plastic prop out of the movie "Tron". But they lack the handmade qualities of earlier antique electronics.

    So, what's the fate of my Commodore 64 in twenty years? Cherished museum piece that people will love to turn on, try out and admire; or will it be reviled and ridiculed for its age, simplicity and primitive design?

  • by kootch ( 81702 ) on Monday August 21, 2000 @11:31AM (#839280) Homepage
    I'm sure that there would be considerable interest in the future, if by studying the past you discover trends that facilitate discovering future products and technologies.

    Let me state that again. Look at game designers. There are some very good game developers and companies that spend serious money looking at old games to determine how successful they were in different aspects, and why.

    If you like the history of computing, I'd say to try find an application of it that looks at the computing of yesterday to determine what the computing of tomorrow will be like.

    How do you do this? Research, write articles, and create a demonstrated need. Show companies what they'd gain by reading your articles and getting your opinion in their R&D.

    It's a neat idea. Takes some work, but there will probably be a strong demand for it in the coming days.
  • There's a computer museum in Boston, and Bruce Sterling has written about it.

    I don't think this is the case anymore, at least as far as computer history is concerned. The Computer Museum History Center broke off from The Computer Museum in Boston as early as 1996 or so and moved to a building on Moffett Federal Airfield in Mountain View, California. You can visit, but you need to get clearance to get onto the site.

    Website: http://www.computerhistory.org/ [computerhistory.org]

    Also of interest (and closer to Boston): the Retrocomputing Society of Rhode Island. Their website is here [osfn.org]. There are more museums scattered here and there, but I believe these two (and, perhaps, the Smithsonian) are the foremost.

    --Tom

  • by Anonymous Coward
    Yea, good thing we have all those Mechanic Historians coming out of college, or we could never fix our old cars!

    You are so right. We desperately need computer historians. Otherwise, how will I get by when my PDP-7 breaks down!? I might have to buy a $20 pocket calculator to replace it. What a waste!

    Thank you, God bless you, and God bless the United States of America.

  • ... is a large university. Especially ones with strong Computer Science departments.

    Part of the duty of such a historian would be to help provide a means of translating old stored data to newer mediums, as well as similar duties for the transcription of old code.

    If industry has perceived a serious need for this, then they have probably pulled a few monetary strings, and courses on such subjects might be appearing. A lot of colleges work closely with industry to determine what skills need to be taught. (I'm not a CS major, so I can't speak from personal experience for that field.)

    It also makes sense that a university might have enough space and funds to maintain a library/museum of old data and data storage devices, if only to maintain their own records and to provide (sell) this service to smaller companies that can't afford to do it themselves.

    Failing that, I can guess with about 90% certainty that the Smithsonian, or Library of Congress (which collects all forms of media) might have departments dedicated to the history of computers. (If not, they should!)

    Good Luck with your search!

  • Computer history is all about patent prior art searches. You could be one of the guys lawyers hire to find examples of the way things used to be, so that you can demonstrate that something someone is trying to patent is not really a new idea at all.

    Take EE in college and you'll be on your way.

  • Plenty of other posters are speculating on what one could do as a professional computer historian, but a better question is how you'd get there. May I suggest that you find a university with a CS department that offers lots of ethics courses and then joint-major with History. Or get a CS degree and a diploma or MA in museum curatorship. The latter is available at Trent [trentu.ca] where you can get a Computer Studies [trentu.ca] degree and a Museum Curatorship [trentu.ca] diploma in just 4 years.

  • by bubbasatan ( 99237 ) on Monday August 21, 2000 @11:35AM (#839286) Homepage
    You know, you certainly could ask Al Gore for a job as a computer historian. He needs all the help he can get proving that he invented the internet. You could be on CNN tomorrow telling everybody, "I was there with Vice President Gore when he invented the Internet. I helped him bind the servers into one connection. If it weren't for Mr. Gore, there would be no internet. The built-in CAT 5 data port in Al's neck allows the Father of the Internet to jack into his child every day...." and so forth, ad nauseam. Everything I say is just because I'm a history major forced into the IT world. These computers are so naughty, with their fancy Illudium processors.
  • While it might be nice if it were dead, it most certainly isn't. And believe it or not, I know plenty of government agencies that still have windowless TN3270 terminals for database lookups.

    Personally I'd like to have rows of blinking lights on my workstations : ( I see no reason that had to die. Why!?! Why!?!
  • /. readers' two favorite institutions, Microsoft and the NSA, both have museums.

    Microsoft Museum [microsoft.com]
    The Microsoft Museum is mainly focused on the history of Microsoft, although it does have quite a bit of information and exhibits on the history of computing and computer software. The whole place is decorated in Microsoftie colors. It's located on the Microsoft campus in Redmond, WA. Unfortunately it's not open to the public, but I got to attend a party there while I was interning for The Great Satan of Software. However, they do have a fairly nice website that's available to the public.

    National Cryptologic Museum [nsa.gov]
    The NCM is run by the NSA and is located on/near Ft. Meade in MD. It gives a good overview of the history of crypto and includes a lot of information on early computing and the role it played. They also have a small public library with plenty of old books that deal with crypto. It's open to the public and has a gift shop where you can buy plenty of things with the NSA logo on them.

  • by pb ( 1020 )
    I was using the english word "lucid".

    lucid (lū′sĭd)
    adj.

    1.Easily understood; intelligible.
    2.Mentally sound; sane or rational.
    3.Translucent or transparent. See Synonyms at clear.

    [Latin lūcidus, from lūcēre, to shine; see leuk- in Indo-European Roots.]

    lucidity or lucidness n.

    lucidly adv.


    I was going for the first definition; unfortunately, there is no convention for this in English, so I will use the convention used in unix(7) man(1) pages.

    Even a lucid(1) history* with pointers to resources would be nice(1).

    *BASH_BUILTINS(1) SEE ALSO bash(1), sh(1)
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
  • Here's [userland.com] a very relevant article criticizing most accounts of computer history. This fellow basically says that most people just tell stories with classic heroes like RMS. I think a computer historian would be a great profession, since it's very badly needed.
  • This might prove illustrative.

    As a young man, I was flirting with becoming a dedicated coin collector and I was already an avid photographer. The Smithsonian had a position available for just such a person. The position required studying their collection, documenting it both in words and photos, and acting as a resource person for all things numismatic. Even the educational requirements weren't too high which is understandable since many/most/nearly all top-drawer numismatists are largely self-educated.

    The catch? The job paid, IIRC, about USD$15K per year. Living in DC, that would have meant camping out under a bridge somewhere.

    My point? I imagine that many of the same forces would be at work when it comes to a position as a computer historian. Such a job would be fascinating but the market value of such services would likely be low. The people who would employ you would be doing it as a service to the hobby population. As a self-employed person, you might be able to deal in collectible computers, if and when such a market ever develops. (And it's out there, actually. There are people paying premium dollars for rebuilt Tandy 102s and the like. But it's certainly not yet a huge market.) I think the best feedback so far is from folks suggesting academic pursuits.

    Of course, if old computers suddenly start to get fashionably "collectible," all bets are off.

    :-)
  • There is IMVHO no justification in any business to focus yourself completely on the history of computers, simply due to the fact that its history is written down from a to x (x being undefined here), and despite the fun you may have by checking up on older computers, the information it gives you is just obsolete.

    Take for example XT based computers. Believe it or not, these machines are still being used by some corporations due to the simple fact that they can do what they need to do, and it's a waste of money to upgrade 'em with other machines when there is no need. But when there is an error and such a machine does crash, the company sure won't call out for a 'historian' to tell them what they need in order to fix it. That's simply too expensive. It's far easier to buy a new PC (which is of course downwards compatible with XT machines (iow; any other PC)) and off you go.

    I'd advise you not to focus on this subject from a 'business point of view', but I sure would advise you to spend some time on it in your spare time. If you like the idea of seeking out information on older computers and also focus on the way they work, you can learn much more about the whole 'computer aspect' than you can from studying books like "computer internals" and such. It's much easier to begin simple (XT) and work your way up to the current machines. Of course this will take more time than it does to study a book, but it sure will pay off more. For starters it will give you more insights into the workings of the machines, which automatically can lead to more insights into the internals of OS's, which can lead to..... Well, I guess you get the picture :) Good luck!

  • by pb ( 1020 )
    Thank you; it's been a while since I've been to that museum.

    Their main attraction back then had to do with calculating path length between cities in Pascal on a very *large* computer; back then, it looked pretty impressive! :)
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
  • This could be a graduate thesis for Anthropology or Philosophy/Sociology of Science if CS Depts won't take it. And the academic discipline of History is more interested in technique than content, so they'll let you do anything that requires the right kind of research.

  • I hope that's what the poster intended. I would really be concerned if he had meant:
    Even a turbid history with pointers to resources would be nice.
    or perhaps
    Even an obfuscated history with pointers to resources would be nice.
    ________________
    They're - They are
    Their - Belonging to them
  • He has an interesting book titled Artificial Life: A Report From the Frontier Where Computers Meet Biology that covers the history of AI (slanted toward alife).
    Cheers,

    Rick Kirkland
  • by Mignon ( 34109 ) <satan@programmer.net> on Monday August 21, 2000 @11:41AM (#839297)
    IBM has a public gallery space in their NYC office. At one time there was some display of historical computers (made by IBM, of course.) You may want to contact them.
  • As someone with a history degree (yes, from that Beserkeley place), I'd have to say stick with the computer part. History majors make just a little more than social science majors (most students get a history degree so they can take the easy path into law school), and we've all seen what CS majors pull coming out of a good school. There are a couple of good books on the history of computers (I've read most of them, but not all). Start with Campbell-Kelly and see which eras you prefer to dive into. Most of the folks on Slashdot better get out there and get a copy of Peter Wayner's Free For All (good stuff Maynard), put that on the best seller list and Uncle Bill will come out and say that (like Octavian) he doesn't want to be the richest, most powerful man in the world (remember what happens next, hint - Augustus, 1st Roman Emperor). If you insist on pursuing a history degree, you can always sell games (paper ones) or teach (even funnier), or do some accounting, and hope someone believes enough in you to let you play in their database (that's sort of the path I've gone - not recommended for the faint of heart). But best of luck (maybe someday someone will pull an Atlas Shrugged and we won't have to worry about business and money anymore).
  • If you're serious about this:

    * Get a good general education. Learn to WRITE. Being a historian is an academic job, and you're going to have to write papers. Even if you avoid that horror, you're going to have to write grant proposals and such.

    * Take some journalism classes. Learn to write for a popular audience. If the history thing doesn't work out, you can become a pundit.

    * Learn the history of science and technology. It's fascinating stuff, and it will put the history of computers and related technologies into perspective.

    * And take some computer science courses! Programming I and Programming II (or their equivalents), data structures, and most importantly an Operating Systems course. Both of the OS courses I took had a LOT of history built in.

    Stefan

  • In the Seattle area there's a company called RE-PC [re-pc.com]. It's a fascinating warehouse-like place that has bins and bins of old computer parts for sale that would give any hardcore geek's historical recall a run for his money. The place is run by a bunch of irritable and cranky guys who are tired of answering stupid questions. I go in there periodically and spend hours poring over the bins trying to identify parts and picking up a few necessities for the reasonably up to date systems I have at home. In the back of this place is an amazing little computer museum. Stuff like an original field testing kit for the *OLD* IBM hard drives and vintage system parts from computers that existed before I was born. All of the old systems that we used to love as kids are there as well. You really have to go there and see it for yourself.

    I always thought that if they marketed it better (and housed it in a nicer looking building) it could be a draw unto itself. What it needs is an energetic person who can build it up and market it the way it should be done. Of course you would have to sell the owners on it, but if you have a good vision it wouldn't be that hard. I don't think it would be a HUGE draw right away, but it would break new ground. Perhaps it could even be the home of, AFAIK, the first computing hall of fame. The possibilities are only limited by your energy and vision...
    --
    Quantum Linux Laboratories - Accelerating Business with Linux
    * Education
    * Integration
    * Support
  • I attend RPI. Computer Science majors here are required to take a class called "Computer Organization", which is a combination of computer history and the fundamental architecture of processors and memory and such. The two subjects go well together, because as we learn what cache is and how it makes stuff faster, we can learn about Von Neumann. I would be surprised if there wasn't a similar class in many other colleges.
  • You're comparing Computer Historians to Mechanics. Okay, let's run with that analogy.

    Walk up to a Mechanic, and ask him what year the first car was built, where it was assembled, and what kind of engine it had. Or ask him what year the Bel Air was introduced.

    Now, switch places. Walk up to your Network Administrator, and ask what year ENIAC was built, and what it was used for. Or ask what the first commercially available personal computer was, the year it was introduced, and how much it cost.

    Pretty stupid, eh? Your argument is flawed.

    Mechanics have a skill, and are trained to provide that skill as a service. History is simply reference information. You can easily find the history of the automobile without consulting a mechanic, just as you can locate historical information without the assistance of a Network Administrator. MY point is, you don't need a Computer Historian either. It would take more trouble to contact a Computer Historian than to hit a search engine and find the information yourself.

    -- Give him Head? Be a Beacon?

  • ....And I know him! His name is Sheldon Hocheiser, and he's a graduate of Reed College (Portland, OR). (Which is how I know of him, me being a reedie too.) So, Technical Historians do exist! I don't know about "computer historians" specifically, though. There is a Microsoft Museum on MS' campus, though.. Good luck, Blue
  • From my own experience, Stanford [stanford.edu] is definitely a good choice. You've got a world class computer-science department alongside incredible "fuzzy" liberal arts departments. Since we are right in the middle of Silicon Valley and the technical departments are so strong, course offerings in the economics, history, and sociology departments often have a technology slant. Last year, there was a seminar [stanford.edu] which detailed the history of the Valley. A course in the communications department on the effects of digital media (which I couldn't take because class size was limited!). And even the computer science department hasn't forgotten its history. Gates (ugh) Computer Science [stanford.org] building is full of computer artifacts, notwithstanding Don Knuth [stanford.edu] :)

    If not Stanford, find another top computer science department: UIUC, Carnegie Mellon, MIT, etc. If the CS department is strong, it will flow into other departments who want to ride the wave.

    As far as coursework goes, most schools allow majors to be designed if they don't have one which follows your exact path. Definitely take some CS courses to broaden your knowledge of technology, but a couple of history and economics courses wouldn't do any harm either. Just remember to get as much out of college as possible, since it's only four years.

    OK. I'm on my soapbox; but I am a senior, so nostalgia has set in. Good luck and feel free to e-mail me if you have any questions.
  • For most of us "computer history" is just a recent memory.

    Yes, I will never forget what a hottie Ada Lovelace was in her younger days. I really miss her.

    For the sarcasm impaired: Even if you are an aging boomer who was writing for mainframes back in the 60's, most of "computer history" happened long before you were born.

    Perhaps this kid can teach you a thing or two after all, v4mpyr. :)

  • I would have to argue:

    Computers, while relatively new, are a rapidly changing medium. And one that has entered into just about every facet of modern life. (There are people who study the history of mass media!)

    To some, all the subtle changes from one system to the next might seem like an exercise in detail.

    But ask any engineer, and they'll tell you that for a project of any significantly large scale, the devil is in those details! And there are probably a good number of people who are willing to pay for someone who knows how to find those details.

    ^$1 for the chalk mark, $49,999 for knowing where to put it!^

  • We definitely need a computer historian to record the motives behind important decisions. This would help us to discount 'time proven' (haha) standards, such as sendmail, to clear the road of barricades preventing us from moving on to better things, such as Postfix and qmail. Another example would be moving past telnet to ssh.

    Another role for computer historians, one that is crucial at the present time, would be to record the status of the industry and related industries along with, again, the motives behind decisions, in relation to technology laws such as the DMCA and UCITA, to keep a clear image of why they were passed and why we should abolish them.

    --Drew Vogel
  • The Computer History Graphing Project [sourceforge.net] -- it looks like it needs some work, but we'll see.

    There's lots of other info out there too, like FOLDOC [foldoc.org], which could probably be incorporated into a project like this.
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
  • It is true that there's heavy info available already, but I still see the need for computer historians. They should be the ones trying to set the record straight. Many oft-quoted pieces of our computer history are reported 50 different ways in 50 different places. Computer historians could sort these out, do real research, and present us with a more accurate picture of where we came from.

    As in any "somesubject historian" profession, certain historians will over time gain respect for being very accurate and knowledgeable, which will lend credence to their judgements on the reality of computer history.

    I'm all for it. I'd love to know for sure the details of a lot of those hazy-ish computer legends I hear about all the time.

    Now is a really good time to start as well, since there are still a lot of founding fathers alive to be interviewed.

  • They have a mini museum [ibm.com] of the earliest of early IBM days when it was three separate companies. I'm sure someone there could help at least get you started.

    -psxndc

  • Have you looked at the various computer timelines out there at the moment? Ever compared them? If you have, you will notice that they tend to contradict each other, or leave out some parts of computing that others include.

    Which came first, the Z1 or Colossus?
  • As someone who is both a Computer Science and History student (and is getting degrees in both, with an honors thesis in History), the bad news is that the prospects at this moment are grim.....

    The great news is that the prospects in 10 years will be great, esp. with things like the NSA, Echelon, and other nice nasty things out there; they will increasingly play a role in history, and conversely, historians will have to know about CS to tell it as it is.

    Unfortunately, the typical definition of history is 25 years ago. This is just now hitting the Computer age. Give it a bit yet...
  • by MaximumBob ( 97339 ) on Monday August 21, 2000 @11:51AM (#839317)
    Forget about "the industry." Become an academic.

    Yes. Picture it. Spend your days on a college campus, teaching classes on the history of computers. You just come up with some random BS thesis on the ways in which computers have affected and changed society, and run with it.

    The advantages? It's tough to get a job as a college professor, but once you do, you're good to go. Plus, you spend the rest of your life around college-age women.

    Come over to the dark side, Luke.

  • Seems like more of a side project for a professional journalist. Like what Cringely did with Triumph of the Nerds, only bigger and longer running. But I think you're unlikely to find someone that's willing to hire you just for a project like that, without you being proven in the field at all.
  • Getting a job as a historian is a little higher class than screen writing. Just figure out what the de facto state religion is and support its myths with your writing, whether "fiction" or "historical" -- it doesn't matter. Only when you figure out what the dominant elites want to hear about and want to forget about, can you make a career of being a historian -- otherwise you might face bankruptcy or even jail time (although things haven't gotten all that bad in computer history yet). For a few examples, here are the sorts of histories you don't want to write about if you want to make money as a computer historian:

    http://www.geocities.com/jim_bowery/potc.html

    Sherwin Gooch's Account of John Bardeen's Lecture (Score:1)
    by Baldrson [mailto] (jabowery@netcom.com) on Tuesday December 28, @08:58AM EST
    (User Info [slashdot.org]) http://www.geocities.com/jim_bowery [geocities.com]

    In any case, I'll check with Sherwin Gooch to see if he has any more direct evidence from Bardeen himself to support the controversial account of the hide-away experimental stand.

    I did, and here is Sherwin's response:

    Jim,

    Thank you for alerting me to your discussion.

    To provide a more solid foundation, one should be aware that I heard this story from the horse's mouth.

    John Bardeen himself gave a talk one evening at Altgeld Hall on the University of Illinois campus, circa 1978, in which he related various experiences surrounding his inventing the transistor. At the time, people suspected that the scheduling of this presentation may have been related to Bardeen's health.

    Professor Bardeen showed us the B&W 16mm film BB&S had made at Bell Labs immediately after they got the first transistor to work (and, presumably, before Bardeen's boss got to work the next morning...) I have seen individual frames and out-takes of this film since, but I don't know if the entire film still exists. The "rolly-cart" with their experimental set-up is plainly in evidence on the film.

    It was John Bardeen himself, at Altgeld Hall, who related that his boss had said that the "solid-state amplifying device" which they wanted to develop was "not feasible," and that, "even if it were possible, it would have no practical application." Dr. Bardeen related that sometimes, when his boss stayed at work past 5 p.m., the three of them would become very impatient waiting for him to leave so they could roll their setup out of the coat-closet, and get busy on what they, apparently, thought was the greatest "cool hack" of the day.

    I wonder who Bardeen's boss was. His boss should be immortalized in history next to the NASA manager who advised the last engineer withholding approval of the Challenger launch to "put on your management hat!"

    One of the anecdotes John Bardeen related was how he had left his set of photographic slides in the taxi which took him to the ceremony to collect his Nobel prize, and all the trouble to which he and the Swedish government had gone in trying to recover them. But their efforts were unsuccessful; the slides were never recovered. Professor Bardeen was extremely apologetic that he didn't have them to use in his presentation, and so we would just have to make-do with his relating the incidents to us.

    With my background in computer music, I found one of the pieces of supporting paraphernalia that Dr. Bardeen didn't lose in Sweden quite interesting. He brought along a transparent plexiglas box, approximately the shape of a 6" cube, with randomly distributed 3/4" or so holes (apparently for cooling?) in the sides. On the top were a number (6 or so) of black SPST N.O. push buttons. A small loudspeaker was mounted inside. (There must have also been a battery of some kind, but I don't recall it.) The box contained a collection of electronic components, their leads soldered to one-another ("tacked together"), and hanging in "free space." (He hadn't bothered to use a prototyping board or connecting strip.) There were resistors, capacitors, possibly some coils, and these ~1" long bar things (which were the transistors), of which there were 3. Dr. Bardeen explained that he had chosen to build this device because it embodied what he considered to be the fundamental 3 types of circuit: an amplifier, an oscillator, and a filter. He remarked that he thought that pretty much covered everything you could do with electronics. Each of these had been implemented as a single-transistor circuit. Dr. Bardeen then demonstrated the device (which still worked!) by playing "a drinking song of the time, which some of you may recognize" by pressing the few buttons on top of the box in the proper sequence. He apologized because it had gone so badly out of tune (which it had). He apologetically related that he had never re-tuned it. (I'm afraid I didn't recognize the song, nor did anyone sitting around me. I believe he said he had chosen it, in part, because the chorus could be played using a minimal number of different notes. I got the impression that he was somewhat embarrassed by the song, and that's the reason he didn't tell us its name. I wish I knew what it was.) Even though this makeshift musical instrument was out of tune, I believe the monotonicity of pitches, as one traveled from one end to the other of the row of buttons, still held. The pitches were also all still of a central musical frequency.

    Professor Bardeen then passed this device around the audience for everyone to examine, which amazed me at the time, and still does. I wish I had a picture of it. I think this first all solid-state device -- an electronic organ -- should be in the Smithsonian. After all, it contained 3 of the first transistors ever made, AND THEY WERE STILL WORKING!

    But I wax nostalgic. Jim, if your point was that Bell Labs did not support Bardeen's research into solid state amplifying devices, you are in good company; John Bardeen, himself, was certainly in agreement. If there were teams being supported to research that area, perhaps he just wasn't lucky enough to be on one of them. I have no idea. All I know is what he told us.

    Please feel free to copy this e-mail (less my e-mail address) into any discussions in which you were involved. I find it particularly upsetting when people or organizations fraudulently assume credit.

    While it is true that many research facilities can be viewed as "sand boxes" which, independent of management, enable invention, and that many great breakthroughs could not have been accomplished without the collections of tools and talent amassed therein, in reality the role played by management in R&D is much closer to what Scott Adams has chronicled in "Dilbert" than it is to any accepted management text or theory.

    Sherwin Gooch
    991227

    Nor are you advised to allow anything like this to shape your dreamwork:

    http://shop.barnesandnoble.com/booksearch/isbnInquiry.asp?isbn=0471048852#customerReviews

    Jim Bowery (jim_bowery@hotmail.com), 46-year-old network architect, August 18, 2000
    The Rise and Fall of Midwest Computing Sans PLATO
    This is a great book, conveying much of the flavor of what it was like to be in the midwest's computing culture in its heyday of the 60's through the 70's. What it failed to do was tell the real story of the midwest's demise as computing leader of the world -- which isn't the story of Seymour's obsession with packaging over on-chip integration, as implied by this book. Rather it is the story of the failure to deploy the network revolution, now embodied in the Internet, to the mass market 20 years early on Seymour's matured hardware via the PLATO networking project at Control Data Corporation. PLATO was a $1 billion 'bet the company' investment by Bill Norris, the farmer/CEO of CDC who put a windmill pump from his Nebraska farm in front of CDC's corporate towers to remind people where they came from. That is the story of epic proportions only grazed on by this book. PLATO was ready to go to mass market, but Wall Street combined with classic middle mismanagement killed the mass market version of PLATO before it could even be test marketed -- for which it was ready. Had it gone otherwise, Seymour probably would never have left the midwest, and his supercomputer architecture would have focused more on the directions now being taken by Sun and Hewlett Packard -- except with Seymour's inimitable qualities.

    I personally worked with the PLATO project and tested a version of it that would have leased a network computer with Macintosh-like interface, including network service, for a flat rate of $40/month with capital payback in 3 years. It had everything -- email, conferencing, user-programmable electronic commerce, multiuser realtime graphics games not to mention thousands of hours of computer based education courseware for which the PLATO system was originally designed. We could get this performance because the culture surrounding the land grant colleges of the midwest, such as the University of Illinois where PLATO originated, combined with Seymour's astounding performance levels created the right tradeoffs between hardware/software. Some of us were looking forward to incorporating Seymour's newly marketed Cray-1 as the foundation for the next generation of mass-market PLATO system -- and initial benchmarks looked to provide an outstanding bang for the buck as an information utility hub -- even without some of the more obvious architectural optimizations that would help in this new kind of application of his systems. This would have shielded Seymour from the vagaries of the government-dominated supercomputer market and driven his architectures into higher levels of silicon integration faster -- possibly providing the kind of capital in the kind of organization that could have delivered on gallium arsenide's potential, unlike the disaster that occurred when Seymour left his farm and went cheek-to-cheek with the military in Colorado Springs, CO.

    If you look at your Internet Explorer Help menu and select About Internet Explorer, you'll notice it is based on the NCSA Mosaic web browser and that it was developed at the University of Illinois -- right across the street from where PLATO was invented. This was no fluke. PLATO had a profound impact on the culture of the University of Illinois, particularly its young students who wanted to push the envelope in networking. The NCSA also gave rise to the most widely used web server, Apache, and to the founders of Netscape. The loss of possibly 20 years of 'new economy' is incalculable, but suffice to say, comparable losses have been suffered as the result of open war.

    There are a lot of anecdotes this book doesn't tell that will probably die with the people who lived the tale. Just one, to capture a bit of what will be lost to history:

    People looking for Cray Research's facility in the fields of Wisconsin could drive up to a farm house and ask where 'Cray Research' was located, and a friendly neighbor would say, 'Oh, you mean Seymour's place...' and then give directions to an area surrounded by an almost invisible network of intelligence agency surveillance equipment -- protecting what was seen as a national treasure from potential espionage. In a speech to one of these agencies, Seymour told them they could come out and protect his folks but only if they never got in the way, and that meant not even letting anyone know they were around. Well, you could tell they were around, but at least they didn't get in the way!

    --------
    Good luck -- and when they ask you to please consider castration -- tell them that went out with the pharaoh's eunuchs (which has nothing to do with UNIX, as obvious as such an association may be to them).

  • WooHoo! You see my point!

    I agree, Computer Historians are the ones who generate the web pages, but the Ask Slashdot article was asking if there was a *JOB* in existence for a computer historian.

    As a hobby, yes. Profession, No.

    -- Give him Head? Be a Beacon?

  • by Durinia ( 72612 ) on Monday August 21, 2000 @12:01PM (#839326)
    Agreed, but don't forget about Computer Architecture! The systems and processor architecture classes I took actually followed a historical path in teaching the concepts.

    A lot of this history can be found in everybody's favorite textbook: "Computer Architecture: A Quantitative Approach" by Hennessy and Patterson.

  • If you're local to the Silicon Valley, you might care to check out the Vintage Computer Festival [vintage.org] at the San Jose Convention Center on the weekend of 30 September. Nothing less than ten years old will be on display. You may even be able to see working models of Altair and IMSAI machines. A couple of years ago, a friend of mine brought his DEC PDP-8. And there was also the Wall-O-Mac, with every Macintosh model released up to 1993 or so.

    I plan on being there. As the owner of two SOL-20 machines (one with a Helios drive), I have a soft spot for the old machines I cut my teeth on.

    Schwab

  • I had to take this class at APU (http://www.apu.edu). It wasn't that bad of a class.

  • by table and chair ( 168765 ) on Monday August 21, 2000 @12:04PM (#839331)
    I guess I shouldn't be surprised at the often overly-pragmatic replies of many of the posters here, but there would seem to be more to this issue than transcribing old code and keeping 20 year old machines running.

    Historians not only analyze the past; they also often catalogue the present. This is vital in a field in which massive change over small amounts of time is a matter of course.

    As a designer, I'm fascinated by the effect the internet has had on the history of my discipline. When there is no physical record, there is little in the way of history beyond oral tradition. When websites are redesigned (all of them every day, it seems :P), there's no record of the progression of style or theory beyond what we remember and can tell one another. That's a far cry from the abundant paper trail design has left through the 20th century.

    I imagine everyone who works with, on or around computers has similar issues to face.

    How will future students investigate history without a physical record? The answer would seem to be found in people like the kid who asked the topic question, people who can archive, catalogue, analyze and synthesize information about the information age as it happens. There's no time for traditional history, in which we sit back years later and dissect a great battle or read through ancient manuscripts in search of insight... because the record will be gone after the next daily big breakthrough.

    I think there's a great deal of promise for this pursuit. Computer historians will ensure that we will continue to be able to learn from "the experience of our predecessors, [and] to sustain an imaginative grasp of posterity*"

    *quote from Rick Poynor
  • >There are other good books too. Alan Deutschman's
    > _The Second Coming of Steve Jobs

    How do you know if it's a good book? I thought it wasn't out yet. Or am I thinking of another book?

    I'd seriously warn anyone against taking any book with a history of Apple too seriously, however. It seems that there's not a writer out there who can put his feelings about Steve Jobs aside and simply write a history of the company.

    Owen Linzmeyer (sp) did a fairly decent job of remaining detached in "Apple Confidential". But that one doesn't read like a history, so much as a collection of mostly independent essays, from which you can draw out a sense of the company's history.

    Pretty much everyone else, however, uses their "history of Apple" book as their personal soap box, either to praise Jobs for his genius, or to tell the world how much they despise the man.

    For a good example of the first, see Steven Levy's books; in particular "Insanely Great: The Life and Times of Macintosh, The Computer That Changed Everything". The title pretty much says it all, eh? Jobs is almost the messiah in this one. Or you could read Pogue or Kaplan and get much the same.

    For the other side, this "Second Coming" book has already been widely described as a "hatchet job". Robert Cringely, in "Accidental Empires", calls Jobs "the most dangerous man in Silicon Valley", and compares him to the likes of Jim Jones and Saddam Hussein. And don't even bother with Gil Amelio's rag.

    Not having met the man, I can't say IF he's so great as Levy thinks, or the physical incarnation of evil, as Cringely would have you believe. I suppose you could just read them all, and try to pick a middle ground to believe. Just don't expect anything resembling objectivity from ANY of them... with perhaps the singular exception of Linzmeyer (sp).

    john
    Resistance is NOT futile!!!

    Haiku:
    I am not a drone.
    Remove the collective if

  • by hey! ( 33014 )
    IEEE, I believe, publishes a journal on the history of computing (the Annals of the History of Computing) which I used to subscribe to and enjoy quite a bit when I was a member.

    I would advise going to a good University library, finding this journal, and then finding articles that interest you in it. Then figure out what the backgrounds of the authors are, or even contact them for advice. Clearly, the best procedure will be to find out who's doing the work you're most interested in and how they got to do it.

  • That's one perspective, but TV and radio haven't changed much. What changes have taken place are backwards-compatible. There's no need for old TV sets, because new TV sets can do all that and more.

    But it's different when you look at IT.
    Punch cards are still in use. My overclocked Celeron has a 5.25" disk drive. My friend just bought a record player.

    Why? Because there's information in old formats that's valuable. And as long as that data's out there, the equipment that can read it is valuable.
    When I was in school (at Worcester Polytech) I was *this* close to becoming a History of Science major. I burnt out on the CS major (after an abortive sidetrack into EE) and was looking for something, anything, that would get me out of school with a usable diploma. The classes I took in the history of science were great. I did loads of research on the early computing done by Turing to break the Enigma and other codes (a dead end, since the details of the work were still classified at the time). I found, oddly, that when put into the context of history, science became a *lot* easier to learn. How we came up with modern physics (particle physics and quantum mechanics) actually made sense.

    Alas, this was back during the recession of the late 80's, and getting a degree in the History of Science looked like it would perfectly suit me for a job saying "you want fries with that?"* So I opted for something else.

    Anyhow, there is an entire discipline out there regarding the history of science and technology. I don't know which schools are big on the history of science, but I can tell you that the big academic journal for the History of Science is called ISIS. I assume you can get some information out of that journal about academic programs.

    (* note to those interested in funky majors like the history of science. The truth is, if you have computer skills, you can get a job even if you don't have a major in it. I suspect that I could have gone on to have the career I have right now, even if I did get a diploma in the history of science. Oh well.)
  • Exactly, head off to a museum and ask for Katz's or Cringely's job! =P Oh wait, doh! They are WRITERS. Get a real job and try and write on the side. If you can make writing pay, do so. If you don't want a real job, try and get real rich parents, or sponge off of a rich/working wife.

    Not to be too rude here, but look into a media job if this is what interests you. I don't see much value in a historian at a company, except maybe if you can scam an ombudsman job for a support group. I figure that is the sort of job that makes the most sense in the real world, that might do what you want. Or try the writer thing for real.

    Oh yeah, if you go into tech writing, get used to getting flamed on /. =)

    --
    $you = new YOU;

  • I'm sure many colleges offer study in the history of computers/technology. I know Georgia Tech does. Look here [gatech.edu]. It seems pretty broad including stuff like sociology too.

    -tim
  • Exactly. Those who ignore the past...

    There are a lot of really good ideas that have gone to waste over the years since the h/w wasn't capable enough to let people run what they wanted to run. Look at what we're using today. Many of the major ideas are very old. Many of the things thought of as "new" had been talked about, or at least mentioned, by the likes of Turing, von Neumann, Zuse, et al.

    It is very valuable to go back and read some of the old papers. I have a copy of Newman's "History of Mathematics" which I inherited from my father. This has lots of papers from major figures in computing. Ever wanted to read George Boole on logic? Or Turing's paper where he proposes the "Imitation Game"? This is a great set of books. There's also some very good papers on general mathematics in there too. It's a bit old now but is very interesting in any case.

    Another good source of information is the ACM SIGPLAN "History of Programming Languages" conferences. Alan Kay's "History of Smalltalk" presentation at the 2nd is fascinating. And for Unix devotees it also has Dennis Ritchie on the history of C. This conference is one of the best for historical recollection. The people who did things are telling you about it.

    I also often go through my old copies of Dr. Dobb's from the 1970s and 1980s. It's very interesting to see the types of things people were proposing for microprocessor-based computers. Dr. Dobb's also got quite a few good papers from well-known CS types. For instance, there's a paper from Knuth on TeX in which he states "...I'm going to write a book about the program..." (or somesuch), and some nice articles by the Bell Labs folks on C, Unix, and algorithm design (Jon Bentley).

  • Oh, I certainly wasn't trying to be insightful. I don't know where THAT came from. Personally, I'm going into academia, and for completely different reasons.

    Actually, I should have tried to be serious. If you want to be a "computer historian," you're not going to go into the industry. You're probably not even going to get a job in CS. You're probably going to work in history.

    I think there's definitely a need for computer historians. They probably belong in universities (I don't know whether it would be in the CS or the History department, though). Computer science may be a young field, but that doesn't mean there aren't already some interesting questions to be studied.

    Possible areas of study include:

    • The genesis of the computer. Is it the brain-child of a few brilliant genii such as Babbage, Turing and von Neumann, or is it "an idea whose time was ripe"?
    • The development of programming languages. Is there some trend of evolution there? The fact that a language as advanced as Lisp is actually one of the oldest is a delicate thing to explain.
    • The history of the Internet. And the puzzling question: why was TCP/IP such a success and OSI such a failure? I think Cerf, Postel &co deserve much the same popularity as Gutenberg, and they are far from it. Maybe the Internet Society [isoc.org] should open a working group on the history of the Internet.
    • The history of operating systems. This is the strangest of all. It started in chaos; then wars raged; and now it is evolving toward uniformity.
  • Old computers, depending on your view of "old", can be extremely useful. There are still lots of PDP-11's in service (a line introduced in the early 1970's), and there are still VAX-11/780's in service (introduced in 1977 or so). I have a plethora of VAXes at home, ranging in age from 15 years old or so, whose software does most things better than the modern Linux/Windows PC and whose capabilities are fairly comparable to fairly new PC's. The uses include programming and things like servers, and these types of machines are very capable for such use (though not the most efficient use of power).
  • Here are a few random thoughts:
    • I can't find any references to it, but I heard on NPR some time ago that historians were concerned about losing a lot of historical info when the new President is elected and new staff come in, with computers being modified/erased/upgraded. So they want to move quickly, take all the hard drives, and archive all the emails.
    • "Learn the history of science and technology...it will put the history of computers and related technologies into perspective." Great point. This field could involve sociology & anthropology. We could really use people who understand the impact of technology on societies hundreds and even thousands of years ago. We often boggle over the wonder of the internet and think it is oh so new, but forget about the impact of similar technologies such as CB-radio, shortwave radio, telephone, telegraph. Knowing how our predecessors were affected by inventions can help us deal with them.
    • It could also deal with aspects of cognitive science: the history of computers is tied to a history of inventors, and that is tied into a study of how people "invent."
    • Don't forget the crazy connections [amazon.com] between computers and other technologies and aspects of life. The building of the trans-continental railroad in the U.S. had a huge impact on the lifestyles of both the American people and Native Americans. That was totally unforeseen 50 years prior. Looking back, what unexpected results will we see that computers brought on us?
    I recommend pursuing the major that you enjoy the most. Even if that isn't History, you can take some classes in it and write papers about the history of technology. Perhaps a column in the school paper about interesting connections between what students use now and what they used 100 yrs. ago. Perhaps you'll end up a programmer who writes articles for a tech magazine on the side. Or maybe you'll get your PhD in history and teach. Perhaps your essays will pave the way to books that explain history for the masses in an engaging manner, like Ambrose [stephenambrose.com] or Burke [palmersguide.com].

    "Information isn't powerful. Information isn't power. ... Hey, who's got the most information? Librarians do! It's hard to imagine a group of people with less power than librarians." - Cliff Stoll
    -----
    http://movies.shoutingman.com
  • Such as Paul Ceruzzi, who works for the Smithsonian, and has written several books on the subject. He's also involved with SHOT, the Society for the History Of Technology [jhu.edu].

    You might also be interested in the slightly less formal Vintage Computer Festival [vintage.org], taking place at the end of September. There will be plenty of history and historians there. The VCF web site also has a long list of links to museums, collectors, etc.

    And, of course, I would be denying my own conceit if I did not mention my own collection [sinasohn.com] of classic computers.

    Computer history is a growing field, but not one that I think you could ever get rich in, any more than any other similar field. Certainly it is fascinating to look back and see just how far we've come.

  • by UncleRoger ( 9456 ) on Monday August 21, 2000 @04:08PM (#839394) Homepage

    The resident computer historian at the Smithsonian [si.edu] is Paul Ceruzzi [nasm.edu]; a very knowledgeable guy. So they already have someone, but other museums might not.

  • ...thinking they know it all, but understanding only a tiny bit of computers beyond the present generation...

    And so they blissfully* reinvent the wheel, over and over again...

    *as in "ignorance is bliss"

  • With the availability of TONS of historical data on computing at the click of a mouse, the job of a Computer Historian is pretty much obsolete.

    Oh sure, and then you find a page like this one [newmedianews.com] which is factually wrong on several levels. (The Gavilan was preceded by the GRiD Compass, and possibly the Sharp PC-5000.) So you can leave such misinformation alone, or you can rely on a computer historian to correct it.

    Meanwhile, can you find out what the first PC was? If you're lucky, you might come across this page [blinkenlights.com], which will test your knowledge and probably surprise you -- it was put together by a computer historian. That same historian has done quite a bit of research into the first pen-based portable, but it's not on the web (yet).

    So don't knock computer historians, unless you don't care whether or not your history is correct.

  • Being somewhat involved [sinasohn.com] in the computer history field myself, I know several people who have made a few bucks off their knowledge -- through providing that knowledge to legal firms for use in patent cases. Prior art is a very big part of proving a patent should not have been given, and having the obscure knowledge of old systems that might have had a particular feature can be very valuable.

    And if you'd like to pick up some of that knowledge, check out the Vintage Computer Festival! [vintage.org]

  • Things are not good in the computer history business, in part because the main-line companies that felt this was important have faded into oblivion (think mainframe and mini) and the dot coms are too interested in wasting their venture capital on roll-out parties.

    The saddest example of the problem is the death of the Boston Computer Museum [tcm.com]. It was strongly supported by DEC, and when DEC went away, so did the funding (and yes, there were other reasons, including some idiots for executive directors). I was in it several weeks before it closed, and it was a pretty sad thing to see. It has been 'moved' to the Boston Science Center.

    The actual museum for the BCM is in California and can be found at the Computer History Center [computerhistory.org]. It looks to be alive and interested in history, not 'gee, look, interactive computer toys for busloads of schoolchildren to play with instead of learning how to add, subtract, multiply or, heaven forbid, divide without a calculator'.

    Probably the most respected computer history place at the moment is the Charles Babbage Institute [umn.edu] at the University of Minnesota.

    In any case, learn more, subscribe to the IEEE Annals of the History of Computing [computer.org], and remember that the dot coms have mostly forgotten/ignored all of this, so you can make money consulting on 'NEW' ideas that are actually old things revisited.

    --multics.

  • Jumping into this late, but I think you shouldn't worry about what's happening now, but rather what will be happening 7 years from now. My advice: go to college, double major in history and computer science (or major in one and minor in the other). Then enter a PhD program in either history, history of science, or science and technology studies. If you do everything right away, max out on courses, and finish your dissertation quickly, this will take at least 7 years. By that time more people will presumably be using computers, and more universities will probably want professors to teach the history of computing. The PhD is important because it will not only make you a professional historian, but it will allow you to get a job just about anywhere. Any large history department will probably have someone who specializes in the history of science and technology, and with a PhD you'll also be able to teach other history courses, making you a more attractive candidate for a job. There are also museum jobs, as other people mentioned, and large corporations often have positions as company historian that you might be able to get. If all else fails you can fall back on your undergrad CS work and get a higher-paying job as a programmer or admin.
    --
  • One of the academics in the CS department where I was working wrote a book about Australia's first computer, CSIRAC [mu.oz.au], which he worked on back in the 1950's.

    On a more general level, I believe that "computer history" is a job for both CS people *and* historians. Professional historians have learned a few tricks over the years about understanding the past, and trying to write history without their skills leads to amateurish, sloppy work. If historians were trying to use computers for their job, should they get help from an expert or should they try and write the code themselves?

  • Incidentally, Bob Cringely's Pulpit of 2000-07-27, Everybody is Wrong [pbs.org] recalls a bit of history of the fiascoes from the people that later became AOL.

    A historian of the computer business has to dig beyond corporate statements.
    __
  • by Maggot75 ( 163103 ) on Tuesday August 22, 2000 @02:32AM (#839431) Homepage
    Few people know that Alan Turing committed suicide after having hormonal treatments mandated by a court to lessen his homosexual sex drive.

    Incidentally, Kenneth H. Rosen's 'Discrete Mathematics and Its Applications, Third Edition' (ISBN 0-07-053965-0) provides great computer history-related biographical and historical footnotes. It's also a must-read for its coverage of, um, discrete mathematics.

  • There is a job that calls for a lot of computer history knowledge: the technology watcher (from the French 'veilleur technologique').
    This job consists of keeping up with the latest relevant technologies in order to advise corporate buyers about potential updates.
    Computer history knowledge is used here to help evaluate how advanced a product really is and to estimate the actual possibilities its use may bring to the company.
    Chosen products are then extensively tested and compared to currently used ones before they can be deployed in a production environment.
    Of course, I used the word product, but this could also be anything that could have an effect on worker productivity (a method, etc.).
    --
  • There is a small but growing collection of historians of science and technology exploring the history of computing and computer technology. (I'm just halfway through my master's program here: the Institute for the History and Philosophy of Science and Technology [utoronto.ca] at the University of Toronto. There's only a couple of us doing computers, but it's a start :)

    You might want to start at the library reading the Annals of the History of Computing [computer.org]. Off the top of my head, Michael Mahoney [princeton.edu] (who started in the History of Mathematics) has done a lot.

    Historians of computing have looked at Babbage, Turing, and Wozniak, but you can start just about anywhere. The field has barely been touched - there are plenty of unexplored areas. And the great thing about the history of technology is that everybody can help: from engineers to economists.

    Myself, as a recent University of Waterloo [uwaterloo.ca] CompSci grad, I thought I'd return to my roots, and write my MA thesis about the early computer science program there. In particular, I'm thinking about looking at the birth of WatFor and the related successes achieved in undergraduate education. Hint: if you have a story to tell about Watfor, email me! [mailto]

  • I'd say his best bet (outside of the history department, like you mentioned) is to do a thesis in computer architecture or operating systems; one need only read the latest Andrew "Linux is obsolete" Tanenbaum textbook to know that there is a lot of leverage for an academic computer historian in those fields.

    If he's talking about personal computer history, he might be less lucky. Most hardware courses feature a history component which is geared toward state-of-the-art-then. If he's teaching at a liberal arts school with an integrative studies program of some kind, he could probably "switch hit", teaching both operating systems and hardware in the CS department, and then teaching (or team-teaching) a "Sociology/History of Computing" class. The sociology of computing angle would be more in tune with personal computer history; the history angle would be more "Turing and the Enigma" perhaps.

    In any case, I'd say major in CS and maybe history or sociology, too. (A philosophy major probably wouldn't hurt, either, as long as you read Wittgenstein, Church/Turing, Frege, etc.) Then find a graduate program, get a Ph.D., and get a tenure-track position in a small enough (or forward-thinking enough) department so you can implement your ideas. Once you're tenured, start pushing your more radical ideas.

    It's not a fast track, but it's a good track, and will be very rewarding if you stick it out.


    ~wog
