Education Upgrades

Convincing Colleges to Upgrade Their Classes? 115

Pray_4_Mojo asks: "I'm an engineering student at the University of Pittsburgh, and I'm currently taking a required class known as 'Computer Interfacing'. While I enjoy the instructor, I find most of the material to be severely dated. We will spend the majority of the class covering RS232/XMODEM/Token Ring means of computer-to-computer communication. Almost no mention of USB, Firewire, or IrDA is made within the class. I am trying to convince my professor that this newer material is relevant, as these types of interfaces will be dominant in the world we future grads will be working in. As an example, I demonstrated that the keycard system used to gain access to the Interfacing Lab has a USB port for data download/firmware programming. The professor seems interested, but it seems that I need to convince the department to revise the course requirements. Has anyone attempted to modernize their CS/Engineering program and met with success?"
This discussion has been archived. No new comments can be posted.

  • Egads! (Score:5, Funny)

    by jo42 ( 227475 ) on Friday March 14, 2003 @04:44PM (#5514447) Homepage
    What next? Sneakernet?? 80K hard sectored 5.25" floppies??? Two tin cans and a string???? And this college degree is supposed to show that you are educated and get you a job in the real world?
  • by gtwreck ( 74885 ) on Friday March 14, 2003 @04:47PM (#5514470)
    It's not about whether or not you have experience in the latest tools and technologies. It's about whether you have the fundamentals in place, so you can apply that knowledge to any other system.

    In the specific case of serial interfaces, there really isn't all that much different between RS-232, RS-485, and USB or Firewire. They are all serial interfaces that employ the same fundamental concepts. In the real world you'll have to apply that knowledge to any number of serial interfaces.
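The shared-fundamentals point is easy to make concrete. Below is a minimal sketch (illustrative names, not any standard's exact electrical spec) of the start-bit/data-bits/stop-bit framing idea that RS-232-style asynchronous serial links share:

```python
# Sketch of asynchronous serial framing (8-N-1 style): a start bit,
# 8 data bits sent LSB-first, and a stop bit. Illustrative only.

def frame_byte(value: int) -> list[int]:
    """Frame one byte: start bit (0), 8 data bits LSB-first, stop bit (1)."""
    data_bits = [(value >> i) & 1 for i in range(8)]
    return [0] + data_bits + [1]

def unframe(bits: list[int]) -> int:
    """Recover the byte, checking the start and stop bits."""
    assert bits[0] == 0 and bits[9] == 1, "framing error"
    return sum(bit << i for i, bit in enumerate(bits[1:9]))

for b in (0x00, 0x41, 0xFF):
    assert unframe(frame_byte(b)) == b
```

Once a student has this model, the newer buses are variations on the theme: different encodings, speeds, and handshakes on top of the same idea.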

    The same logic can be applied to a discussion yesterday about using MS or open source programming environments in a CS department.

    • I'll second this. It costs an unbelievable amount of money (millions of dollars) to design and test a curriculum.
      You see this in other fields as well (e.g. psychology, business etc.) As long as you're getting the concepts, it doesn't matter what the mechanics of the course are based on.
      You'll learn the real-world applications of those concepts quickly enough in the real world. :)
      • Soooo... Go to school for 4-? years to learn the basics in a field (and go into major debt doing it), then go into a real-world job and get told 'forget what you learned in school, this is different'... It's the curriculum model that is antiquated. It's just not practical: in many industries, knowledge advances faster than a curriculum can be made. The last thing I'm going to do is pay to get an education about 5-year-old hardware/software that by the time I graduate is not used any more! :P
        • by itwerx ( 165526 ) on Friday March 14, 2003 @05:32PM (#5514969) Homepage
          No, not at all. (Are you trolling? :)
          You are told to forget the technology which was used to convey the concepts, but the concepts are where the value is.
          Here's an example.
          If you want to learn how to fly a 747 you don't start out on one! You spend many years and tens of thousands of dollars learning the concepts on smaller aircraft. Granted, knowing the gauge layout of a Cessna has zero relevance to a 747, but the concept of watching your fuel levels applies equally well in either case.
          So yes, when you get to 747 school they will say "forget all that other airplane stuff" but they're not really telling you to forget the concepts, just the nitty-gritty details that you don't need any more.
          Comprenez-vous?
          • Ok, I agree with that totally. I am just griping because I am in IT, and have been working in IT for almost 6 years now. When I look at classes offered at colleges, they are all old school... C programming, I mean sheesh, when I took the A+ test, they had old Apple IIe questions on it, and IRQ questions!! I tell you, I memorized IRQ usage and their associations to COM ports etc., and the only time I have used this in years is on antique equipment. I guess I am expressing my frustration, I would like t
          • by Anonymous Coward

            If you want to learn how to fly a 747, I'm betting there are about a zillion more things that you have to do before even getting to taxi to the runway, as compared to a Cessna.

            On the other hand, if you want to learn how to program by learning C#.NET, all you need is notepad and a compiler. In other words, the existence of advanced stuff won't get in the way of the basics, and in the meantime, you're learning modern syntax and modern thought patterns.

            Not only that, but remember that learning the syntax is
            • You, sir, are a $10 an hour programmer, and not a computer scientist. You know syntax like a ditch digger knows his shovel. You say "syntax is vital to becoming proficient -- and employable". That's only true if you want to be a keyboard monkey, and even then, how fast you can type is just as important as the syntax. If you know the concepts, you can pick up the syntax of any modern programming language in a few days. If you don't know the concepts, you're only employable until your language of choice goes
              • by Anonymous Coward
                > You say "syntax is vital to
                > becoming proficient -- and employable".
                > That's only true if you want to be a
                > keyboard monkey, and even then, how
                > fast you can type is just as important
                > as the syntax. ....

                > What good is your education if it
                > teaches you short lived information?
                > Pay to learn the concepts. If you want
                > to learn syntax, go buy a book. ....

                > If you know the concepts,
                > you can pick up the syntax of any modern
                > programming language in a few days.

                i *did*
          • Way to copy an AC post and get a +5 for it. Good ole Slashdot moderation.

      • It costs an unbelievable amount of money (millions of dollars) to design and test a curriculum.

        I'm not sure where you went to school, but I've never studied or taught anywhere that spent ANY money on designing the curriculum. And testing it? Forget it!

        Most professors are left to their own devices to cover what they like in class as long as they hit a few basic points. For instance, compare the syllabi of the same Macroeconomics course as taught by a Keynesian and a Monetarist who studied under Milton F

        • Well, they SHOULD be spending more resources on curriculum design and testing, but as you have stated, few do.

          However, the ability to "do your own thing" varies greatly from institution to institution and department to department. In some places the instructor is given almost no ability to deviate from the syllabus that is handed to them. In others they are almost completely free to do what they want.

          And of course, we don't even want to get into the whole idea that university profs are hired on t

    • Yes, but *token ring*? Near as I can tell there is a vast difference between the TR network protocol and your standard Ethernet stuff.

      On these grounds, I would propose that VMS be reinstated as the standard big iron system, and OS/2 be revived.

      • ...but I do remember it's fairly different from standard TCP/IP. Which would make it quite a useful platform to teach concepts, as there would almost certainly be a dedicated TCP/IP class in most curriculums anyways. It's good to demonstrate different (if unusual) concepts.

        This is why CS curriculums include languages not widely used in industry, such as LISP: precisely because they do things radically differently.

        • I concur with this, but as a basis it's probably not a good idea.

          In immediate retrospect, I'll grant the expense problem, especially in computers, where "shelf life" tends to revolve right around ten minutes for any given thing.

        • ...but I do remember it's fairly different from standard TCP/IP.

          You're trying to compare two things that aren't exactly comparable :) TCP/IP is a different layer from Token Ring in the ISO view of things... TCP is layer 4, IP is layer 3. Token Ring is layer 1 and 2, like Ethernet. In other words, you should be comparing TR to Ethernet, not to TCP/IP.

          You can run TCP/IP over a Token Ring network, just get a TR network interface card. Just like you can run TCP/IP over a serial line, FDDI, SONET, etc...
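The layering argument can be sketched directly. This toy encapsulation (illustrative dictionaries, not a real networking API) shows why swapping Token Ring for Ethernet leaves layers 3 and 4 untouched:

```python
# Toy sketch of protocol layering: the link layer (Ethernet, Token Ring,
# serial, FDDI, ...) is interchangeable underneath IP. Illustrative only.

def encapsulate(payload: bytes, link_type: str) -> dict:
    tcp_segment = {"layer": 4, "proto": "TCP", "data": payload}
    ip_packet   = {"layer": 3, "proto": "IP",  "data": tcp_segment}
    link_frame  = {"layer": 2, "proto": link_type, "data": ip_packet}
    return link_frame

# The same TCP/IP payload rides unchanged over either link layer:
eth = encapsulate(b"GET /", "Ethernet")
tr  = encapsulate(b"GET /", "Token Ring")
assert eth["data"] == tr["data"]   # layers 3-4 identical; only layer 2 differs
```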

          • Bah ha! They taught you how to network using Token Ring!
            • Token ring is still around, and some newer technologies are based on it.
              • Token ring is still around, and some newer technologies are based on it.

                Fibre Channel.

                Well, it's not a literal descendant of Token Ring (is it?). But it's certainly a loop topology. And frequently, the primary cost of deploying a Fibre Channel SAN is in training Ethernet-centric people to administer it properly. (Indeed, the nascent iSCSI market is driven less by a distaste for expensive FC switches than by an aversion to sending one's admins to FC boot camp for six weeks.)

                Incidentally... why a loop?
      • There are lots of concepts that come and go. And then they sneak back up on everyone. Take peer-to-peer and client-server concepts for instance. They pop back and forth in fashion. Virtual machines seem to come and go also. VM's were not started with Java you know.

        Now, token passing is a valid idea. For networks it may not be used currently, but for systems that cannot withstand collisions of any sort token passing is a valid algorithm.
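The collision-free property of token passing is simple enough to sketch. This is a hypothetical toy simulation, not a real Token Ring implementation: only the station holding the token may transmit, so two stations can never collide by construction.

```python
# Minimal sketch of token passing on a ring of stations.
from collections import deque

def run_ring(stations: list[str], pending: dict[str, list[str]], rounds: int) -> list[str]:
    """Pass the token around the ring; each holder may send one queued frame."""
    sent = []
    ring = deque(stations)
    for _ in range(rounds):
        holder = ring[0]               # station currently holding the token
        if pending.get(holder):
            sent.append(f"{holder}:{pending[holder].pop(0)}")
        ring.rotate(-1)                # pass the token to the next station
    return sent

log = run_ring(["A", "B", "C"], {"A": ["hi"], "C": ["yo", "ok"]}, rounds=6)
assert log == ["A:hi", "C:yo", "C:ok"]   # one transmission per token hold
```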

        My point is that just because you cannot find a way to apply a c
    • The same logic can be applied...

      Please don't be so rational - it breaks the flow of nonsense that I usually am able to indulge in when reading "Ask Slashdot" discussions.

    • No.

      USB and Firewire are vastly different from EIA-232 and siblings.

      USB is much closer to Ethernet than it is to EIA-232. I've done some serial development and some USB development, and the USB development is abstracted from hardware by several layers; while serial is barely abstracted by one layer (in microcontrollers, if you're lucky to get a UART).

      It really is different. I would agree that students would benefit from learning more modern interfaces later on, though EIA-232 is perfect for teaching basic communications concepts. I certainly had difficulty the first time I developed a USB peripheral; it had never been taught, and was barely mentioned at all.

      It makes sense now. The abstraction almost makes it easier to develop for on the PC side, and there are amazing features built right into the protocol. A simple microcontroller can change from a keyboard, to a mouse, to a joystick, or dozens of other devices with a simple change in firmware.
      • It all depends on the level that you're looking at things. The connection between software and the raw data is irrelevant to some fields, and critical to others.

        I'm not actually sure how USB works. I always assumed it used the basic transfer technique that serial does for the actual communications (i.e. high = 1, low = 0, we have a parity bit, etc...). What you really need to know about communications, though, is things like different data encoding methods, handshaking, and the various protocols used f
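The parity-bit idea mentioned above can be written down in a few lines. A minimal sketch (of the classic serial-line error-detection concept, not of how USB actually signals):

```python
# Sketch of even parity: the parity bit is chosen so the total count
# of 1 bits (data + parity) comes out even.

def even_parity_bit(byte: int) -> int:
    """Return the parity bit that makes the total number of 1s even."""
    return bin(byte).count("1") % 2

def check(byte: int, parity: int) -> bool:
    """True when data plus parity bit contain an even number of 1s."""
    return (bin(byte).count("1") + parity) % 2 == 0

b = 0b1011001                  # four 1 bits -> parity bit is 0
p = even_parity_bit(b)
assert p == 0 and check(b, p)
assert not check(b ^ 0b1, p)   # any single flipped bit is detected
```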
  • by RevAaron ( 125240 ) <revaaron AT hotmail DOT com> on Friday March 14, 2003 @05:02PM (#5514615) Homepage
    When it comes down to it, the stuff you are learning is the same as all the modern interfaces. The same concepts, not much different. Sure, USB is a bit faster than 56k serial/RS232. But in the end, the *tools* you pick up learning this stuff are what would enable you to learn what is behind USB and Firewire with relative ease. Hell, at least half the class (probably a lot more) will probably forget the information for lack of it being useful down the line.

    Eventually, USB and FireWire may be what is taught in that class, provided they stand the test of time like *MODEM and RS232 have.
    • Fake Assembly (Score:2, Interesting)

      by jasonrocks ( 634868 )

      I started school at BYU this semester. I'm going into CS. The first required class had a horrifying syllabus. We were to learn about C, basic electronics, and assembly language built for a theoretical computer. I was disgusted that we would learn just about NOTHING which would be practical. I transferred out of there so fast. Now, I just hope I can get exempted from that class or take a C++, Java, or x86 assembly course instead.


      void
      • Re:Fake Assembly (Score:5, Informative)

        by nbvb ( 32836 ) on Saturday March 15, 2003 @02:57AM (#5518064) Journal
        Then you don't understand what a CS degree is good for.

        My suggestion: Go to Chubb.

        If you start thinking in the "That's not practical, who cares" mode, you belong in a trade school.

        Sorry, I know that's not very politically correct, but it's the TRUTH.

        Now, if you want to learn real computer SCIENCE, stick it out.

        Learning assembly language for a theoretical computer is a great exercise -- you have to actually exercise that mush between your ears!

        My favorite class in CS was Theory of Digital Machines.... designing AND, OR, NOT gates, building some theoretical microprocessors .... stuff that isn't "practical", but that theory means the world to you later on ...

        Again, if you want practical, go to Chubb. If you want to learn something, stick it out ...

        --NBVB
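The "Theory of Digital Machines" exercise above is easy to reproduce in miniature. A sketch (in software rather than hardware) deriving NOT, AND, and OR from NAND alone, the classic universal-gate exercise:

```python
# Build NAND, then derive the other basic gates from it alone.

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)          # NAND with itself inverts

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))    # invert NAND to get AND

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))   # De Morgan: OR from inverted inputs

# Exhaustive truth-table check over all input combinations.
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
```

Nothing here is "practical" either, but the exercise is exactly the theory-to-building-blocks thinking the parent comment is defending.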
      • Re:Fake Assembly (Score:3, Insightful)

        by g4dget ( 579145 )
        I'm going into CS. [...] I was disgusted that we would learn just about NOTHING which would be practical.

        Computer Science is not job training. If you want job training, take a Cisco or Microsoft certification class.

        A good computer science program will teach you very little that is "practical"; it is expected that you can pick up C++, Java, or x86 assembly language on your own when you are done. If you can't, or if you don't want to, you are enrolled in the wrong field of study.

      • On my engineering degree programme we designed a "fake" processor using VHDL -- i.e. we learned the theory of CPUs using a relevant technology. I found that after class I was left with the interest and desire to invent my own processor. So I was learning both theory AND practical skills, at the same time as developing an interest in the subject.

        I think that learning about simple technologies is a great way to encourage students to think for themselves, to try to invent their own improvements. As others hav
      • Blimey, I have to metamod a mod on this comment that says it is "interesting". And it is interesting, but "interesting" is taken as a positive mod. The problem is that this comment is interesting because of the way in which it is so very, very wrong. I guess I'll abstain.
  • These interfaces are much simpler and are a good base for moving forward. While you're not likely to use them, the concepts are the same AND if you go into a lab you have a reasonable chance to use one of the older interfaces.

    That said a brief discussion of newer interfaces towards the end of the class is probably relevant.

    I know when I was in college we studied the PDP-11 architecture. Well, I've never done anything on a PDP-11 and except for hobbyists I doubt you could find one, BUT I do have a solid und
    • These interfaces are much simpler and are a good base for moving forward. While you're not likely to use them, the concepts are the same AND if you go into a lab you have a reasonable chance to use one of the older interfaces.

      Absolutely true. I have been taught the way the poster is, and it is paying off. Yesterday I had a written and verbally assessed test on the MIPS ISA and it went OK (no previous experience). This is only because I had tuition in basic computing and ISA constructs. I would say no matter how

  • by LordNimon ( 85072 ) on Friday March 14, 2003 @05:12PM (#5514748)
    The newer technologies are much harder to learn from than the older ones. The speeds are much higher, the protocols are more complicated, and the tools are more expensive. For a beginner learning this stuff, you never want to work with the latest technologies.

    If you really want to learn about Firewire, do something with it for your Senior project.

    • Put another way, the students can easily think as fast as a CCITT V.21 connection (that's a 300 baud dial-up connection) and actually follow the modulation/demodulation routines, converting each ASCII character from an 8-bit string of 1's and 0's to an actual character in real time. That's like 30 per second, no problem. They can actually follow, in their head, the train of computer actions from sound on the telephone line to characters on the screen.

      Crank the speed up slowly, give the student a chance to listen to t
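That mental exercise can be written down directly. A minimal sketch (assuming plain 8-bit ASCII groups, MSB first, ignoring start/stop bits) of turning the demodulated bit stream into characters:

```python
# Sketch of the student's mental exercise: decode a stream of bits,
# 8 at a time, into ASCII characters. At 300 baud this is roughly
# 30 characters per second -- slow enough to follow by hand.

def decode(bitstream: str) -> str:
    """Split the stream into 8-bit groups and decode each as ASCII."""
    chars = []
    for i in range(0, len(bitstream), 8):
        chars.append(chr(int(bitstream[i:i + 8], 2)))
    return "".join(chars)

assert decode("0100100001101001") == "Hi"
```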
    • The newer technologies are much harder to learn from than the older ones. The speeds are much higher, the protocols are more complicated, and the tools are more expensive. For a beginner learning this stuff, you never want to work with the latest technologies.

      This applies to all sorts of things. The idea of the class is not to learn RS-232 or RS-485 or RS-3.14159 or whatever. The idea is to learn serial computer-to-computer communication, and the best way to do that is to minimize time on the nuances of

  • You can teach fundamentals with Cobol and Logo too.



    A school teaching the 'fundamentals' using newer technology, like PHP, .NET, FireWire, USB, IrDA, would hopefully give you a better chance of getting a decent job than one still using older technology.

    • Teaching implementations is the job of trade schools, not universities. I expect MCSEs and such to have to go back to school every time a new OS comes out. I would not expect the same of CS / Engineering grads. The service the uni provides is education, not job training; an increase in knowledge / intelligence just happens to make you more valuable in the workplace. Uni is not a job training centre, otherwise you would have classes like Resume 254.
      • If students don't take the course, enrollment drops, and the university cuts funding to the department (as funding is based in part on enrollment in the department's classes).

        Students still demand courses that look good on resumes. Students still demand courses that are enjoyable and interesting. The faculty has a responsibility to teach the "right" material, but simple Darwinian survival of the department means that the faculty must teach that material in a way that gets butts into seats. One way to ke

        • Students still demand courses that look good on resumes.

          Have you ever interviewed a new grad?

          Every hotshot college grad learns very quickly that the "practical" skills you learn in school are worth squat. I've only been out of school for a few years but I code ten times faster and better than I did then, no exaggeration -- and that's still ten times worse than the best of my cow-orkers.

          Which is why I always prefer to see a solid courseload in fundamentals and theory. Then I know I've got someone who c
          • I said:

            Students still demand courses that look good on resumes.

            Sasami [slashdot.org] said:

            Have you ever interviewed a new grad?

            Of course. But I'm not talking about grads. I'm talking about students choosing courses.

            Every hotshot college grad learns very quickly that the "practical" skills you learn in school are worth squat.

            They may learn this once they're out of school - I was talking about the dynamics of students choosing courses, and how that affects an academic department's thinking.

            Which is why I

      • Moderate the parent comment up!

        Personally, I really don't think that undergraduate classes should discuss the latest and greatest bleeding-edge technology until the students have all the tools necessary to understand how the technology works. It's hard to really appreciate the power of things like Paradigm X and Technique Y until you've had to do it the hard way. Likewise, it's hard to know how to apply Paradigm X and Technique Y in appropriate ways if you haven't seen any alternatives.

        This all said, I
    • A school teaching the 'fundamentals' using newer technology, like PHP, .NET, FireWire, USB, IrDA, would hopefully give you a better chance of getting a decent job than one still using older technology.

      A degree prepares you for a career, not a job. A career is a marathon, and a job is a sprint. It's not about "how to do X with Y", it's about "how to do $X with $Y" - do you see the difference? The first is like hardcoding everything into your program, the second is like abstracting all your constants into a
    • When I was a student I learned Lisp and Pascal to learn *concepts*. I have never used those languages professionally.

      In the "Real World" [tm] I have used COBOL, ALGOL, C, C++, Java, Perl and two or three more more exotic.

      It would have made no difference to my preparation as a proficient engineer if I had learned a greater number of buzzwords per hour.
  • When I learned ASM I learned it on a Motorola HC11, not a P4. Learning the concepts is much more important than learning an implementation.

    If one is unable to extrapolate the knowledge gained from studying one form of serial interface to another, then even if the course were modernized you would still be required to go back to school when the USB / FireWire / whatever fad the course is taught with ends.
    • I learned assembly on x86. Then I learned assembly on the Microchip PIC. Then I learned Motorola HC12 assembly (backwards compatible with HC11) and I thought to myself "Holy christ, HC12 assembly is dumb." Maybe I just had a bad teacher. IMO PIC assembly is the one to teach in the classroom because of its simple yet powerful instruction set.
  • by nellardo ( 68657 ) on Friday March 14, 2003 @05:19PM (#5514827) Homepage Journal
    Back in the day, in the early 90's, I was largely responsible/to blame for switching Brown University's [brown.edu] undergraduate first semester programming course [brown.edu] to object-oriented programming. It had been teaching structured programming, but industry at that time was following object-oriented design precepts (even if using languages like C). The faculty all firmly believed in OOP, and taught it in all upper-level courses. But it was seen as "too advanced" for beginning students.

    As it turned out, the real problem was not teaching OOP to the students. OOP is easier to explain to new programmers than structured programming (people use real-world objects all the time - not so much real-world procedures). Half-way through the first semester, the students could implement Tetris.

    The real problem was retraining the faculty. Even though they knew OOP was a good thing, it took them a while before they had internalized OOP enough to present, e.g., algorithms and data structures in an object-oriented style. No one believed that you could teach inheritance and polymorphism before you taught loops, conditionals, and arithmetic.

    Faculty teaching the intro courses may be in touch with industry and research. That's not enough. The faculty need to rethink an entire course to present the right academic material in a modern, industry-relevant way. If the faculty can do that (and, make no mistake, it isn't easy), then you'll get a course the students love, that will get them a job, and that will prepare them for a strong academic program as well.

    For the truly curious, the textbook [amazon.com] for that course is actually still in print, even though it depended on Borland Pascal, which is long-since defunct.
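The claim that inheritance and polymorphism can come before loops holds up in miniature. A sketch (in Python rather than the Borland Pascal the course actually used; the class names are illustrative) using no loops, conditionals, or arithmetic at all:

```python
# Polymorphism with no loops, conditionals, or arithmetic:
# each subclass simply "knows" its own behavior, and the caller
# never inspects the type -- dynamic dispatch does the work.

class Animal:
    def speak(self) -> str:
        raise NotImplementedError

class Dog(Animal):
    def speak(self) -> str:
        return "woof"

class Cat(Animal):
    def speak(self) -> str:
        return "meow"

def greet(animal: Animal) -> str:
    return "The animal says " + animal.speak()

assert greet(Dog()) == "The animal says woof"
assert greet(Cat()) == "The animal says meow"
```

A beginner can read and extend this on day one; the control-flow machinery can arrive later, once there are objects worth iterating over.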

    • The real problem was retraining the faculty. Even though they knew OOP was a good thing, it took them a while before they had internalized OOP enough to present, e.g., algorithms and data structures in an object-oriented style. No one believed that you could teach inheritance and polymorphism before you taught loops, conditionals, and arithmetic.

      Of course you can teach polymorphism first.
      I'm just not (yet) convinced it is a good idea.

      The problem is that algorithms are all about
      loops, conditionals, and

      • I said:

        No one believed that you could teach inheritance and polymorphism before you taught loops, conditionals, and arithmetic.

        d^2b [slashdot.org] said:

        Of course you can teach polymorphism first. I'm just not (yet) convinced it is a good idea.

        It wasn't an "of course" ten years ago, when I was first working on the curriculum.....

        The problem is that algorithms are all about loops, conditionals, and arithmetic. The other stuff is completely orthogonal.

        I'd disagree. An algorithm works hand-in-hand with a data

        • What is an object? Behavior and data tied together. If they are inextricably related (like the red-black tree algorithm and the colorable tree nodes), work on them together, rather than trying to create artificial separations between them. That's just good software engineering.

          I think this is the essence of where we agree and disagree. Immersing people in OOP from the beginning is a good way to create software engineers, but...

          At Brown, it was a success, and led to algorithms textbooks written from

  • The real problem is, CS and CE programs should be designed from a theory standpoint. Any class that relies heavily on a specific implementation or technology will be out of date as soon as you graduate (or sooner). The notable exception being classes that are designed to deal only with specific technologies, implementations, or languages.

    The bulk of your education in CS should be theory, optimization and concepts. Implementation and specific technologies you should be able to pick up on the job or as a to
    • Computer Science should be mostly theory. That's why I am not taking Computer Science. I am taking Computer Engineering because it is more practical.

      Even with CE, though, it's still more important to teach the theory than the implementation. That is why I support teaching RS232 and XMODEM. I do NOT support teaching TokenRing, however, for obvious reasons. I ALSO support teaching USB, because it really isn't THAT hard to learn, and it is obviously the way of the future. There's no point to teaching FireWire
  • Unfortunately, the profs weren't very interested in listening to a stupid undergraduate. Ah well. I went around the system by using the computer club as my own class of sorts. The few folks who showed up every week seemed to really enjoy learning about how the modern stuff works. Plus, I think I had way more fun doing that than if the classes would have changed. Generally I didn't try to lecture all the time. Sometimes someone had something they'd be interested in and do some research then show the re
  • If you want to learn new stuff, the teacher has to know it as well. If the professor doesn't know it, they very well can't teach it to you. If you go to a technical school, like RIT where I go, the professors are just as up to date as the students, usually moreso. I'm a CS major, and they teach us current technologies. Taking a class on XML right now actually. If you aren't in a technical major at a technical school you can't expect to be learning anything valuable. It's like being in an engineering
  • \.

    Everything else is either based on or pretty similar to those two... well, OK, there's also Ethernet's CSMA/CD paradigm.

    THREE things, that's THREE things to learn Cardinal Fang! (And a fanatical devotion to the Pope!)

    John Slimick's the guy to learn from at Pitt; he used to teach at the Bradford campus in the frozen north. He's an excellent teacher as well as an all-around nice guy.

  • Welcome to reality. (Score:2, Interesting)

    by FreeLinux ( 555387 )
    The fact is that everything you will learn at any university is going to be dated. As has been said before, it takes a great deal of time to properly develop a new or up-to-date course. The profs have to be trained first.

    But, no matter how progressive the school is, they will still be behind the industry curve, unless they themselves are developing the technology. When you get out of school you will not have been fully educated on the latest and greatest technology. That's why you do internships and gradua
    • But, here comes the biggest kick in the huevos. Every single generation of students wants/tries to change things. Every single one. Each one seems to feel that they know better what should be done. But the sad fact is that you don't have the necessary life, business, political, and technical experience to be qualified to make that decision.

      LOLLERSKATING. I worked in the computer industry for 10 years before I started college 5 years ago. From my experience, most of my professors' knowledge predates MY entra

  • industry still uses all the "old" things you mention. and all the "new" things are based on the old ones. they evolved from those standards.

    crawling is a useful skill to have. and while walking is better for many tasks, you still learned to crawl first.
    • industry still uses all the "old" things you mention.

      That's exactly what I was going to say.

      I've yet to see USB, Firewire, etc, in use in the "real world" except for consumer-level personal computer peripherals. In industry, RS232 is The Shit, RS485 is a handy substitute for long distances, and people are experimenting with ethernet (some of them are even using Cat-5). Every once in a while you might run into something designed for a parallel connection, but not too often.

      All these hot new interfaces ar
    • Yes, TokenRing is still in very limited use, but it sucks. Maybe if anybody other than IBM had developed it... And no, nothing is based on it. Maybe FDDI, but that's another crazy esoteric protocol that 99% of us will never have to deal with.

      Searched the web for token ring. Results 1 - 100 of about 516,000.
      Searched the web for ethernet. Results 1 - 100 of about 5,770,000.

      Which protocol do YOU think should be taught in schools?
      • More like "definitely FDDI"... same 802.5 frame type. And dual-attached stations? I may be biased because I was weaned on Token Ring at IBM, but even today I think it's pretty damn cool--I'll take a network that has a sense of self-health over the chaos of Ethernet any day of the week.

        What I really wish, is that 100VG [io.com] had gotten off the ground.
      • Both.

        There's merit to teaching a token-based media access control mechanism.

        There's also merit to teaching a shared media access mechanism.

        If all we taught were the Most Popular (tm) results, music students would study Britney Spears, Lit majors would study Harry Potter, and CS would be nothing but how to use MS Word & Excel.

        Sheesh.

        Wait till you get to the Real World (tm). You'll grow up fast, I promise.

        TR isn't that limited in use; lots of mainframe-type environments still use it. Hell, we sti
  • by Kris_J ( 10111 ) on Friday March 14, 2003 @06:51PM (#5515638) Homepage Journal
    Updating a course is not a trivial thing. Deploying new hardware, writing new course notes, and finding new textbooks are all costly or time-consuming. Students benefit financially from stable courses, as there is a greater pool of secondhand textbooks available. Everyone benefits from a well shaken-down course. If the fundamentals are still present and materials are still available (books and spare parts), you will be unable to convince the appropriate people to change the course.

    Meanwhile, most students don't realise this, but you are allowed to do research of your own beyond the scope of any given class. I know you may not have the funds to pursue everything you want, but neither does the college.

  • by lkaos ( 187507 ) <anthony@NOspaM.codemonkey.ws> on Friday March 14, 2003 @06:52PM (#5515646) Homepage Journal
    There are two types of jobs out there: 1) ones that require experience and 2) ones that don't want experience because they want to groom you.

    For the first group, there is no way that you could possibly gain enough experience in 6-9 hours a week for four years. That's only about 4-6 months of professional experience (about two full-time internships if you're so lucky).

    For the second group, employers are more interested in finding someone who is a good problem solver and can pick up newer technologies quickly. In a lot of ways, as an employer I'd rather have someone who learned COBOL at school than someone who might carry bad habits over from C++ or whatever newer language I'd expect them to use on the job.

    <Open Source Evangelizing>
    Of course, working on Open Source software can give you the desired experience and prove you have the ability to learn quickly on your own :)
    </Open Source Evangelizing>
    • by Anonymous Coward
      I wish more people made this point. Working on free software projects is in many ways better for your skills than any paying job could ever be. You often get to work with people far, far, far more experienced than you; you get to ask questions; you get to toy and experiment with different methodologies. If you screw up, you can go back to the drawing board, learning from your mistakes. And, you get to play with all the new toys and cherry pick from the old toys. You learn about tradeoffs in different tools
  • by n1ywb ( 555767 ) on Friday March 14, 2003 @06:56PM (#5515671) Homepage Journal
    It's basically the same here at Vermont Tech. Granted, I think that RS232 and XMODEM are still relevant, as they're SIMPLE. Also, RS232 is still widely used, and while XMODEM may be garbage, it is still the basis of many other protocols and is easy to understand. At least I thought so. I got into an argument with a professor once: as a lab assignment he asked us to connect two computers with a null modem, set the link to 7 bits, and transfer a file using XMODEM, in that order. I told him XMODEM doesn't work over a 7-bit link. He told me it does. It took me about half an hour to convince him that he was wrong.
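    That 7-bit argument is easy to demonstrate concretely. A minimal Python sketch, assuming the standard XMODEM block layout (SOH, block number and its complement, 128 raw payload bytes, 8-bit checksum):

    ```python
    # Minimal sketch of an XMODEM data block (SOH, block number, its
    # complement, 128 raw payload bytes, 8-bit arithmetic checksum).
    SOH = 0x01

    def xmodem_block(block_num: int, data: bytes) -> bytes:
        assert len(data) == 128
        checksum = sum(data) & 0xFF  # plain 8-bit sum; basic XMODEM has no CRC
        header = bytes([SOH, block_num & 0xFF, 0xFF - (block_num & 0xFF)])
        return header + data + bytes([checksum])

    def strip_to_7_bits(frame: bytes) -> bytes:
        # What a 7-bit serial link effectively does to every byte sent.
        return bytes(b & 0x7F for b in frame)

    block = xmodem_block(1, bytes(range(128, 256)))  # payload with high bits set
    mangled = strip_to_7_bits(block)
    # Payload bytes >= 0x80 lose their high bit, so the received data no
    # longer matches the received checksum and the block must be NAKed.
    print(mangled == block)  # False
    ```

    Since XMODEM transfers arbitrary binary files, payload bytes with the high bit set are unavoidable, which is exactly why the assignment as ordered could not work.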

    After putting intense pressure on this same professor, he did spend a couple of days at the end of the class talking about USB, but it was uselessly superficial. It would have been far more beneficial for us to have done some USB programming in lab, or something.

    It is hard for schools to keep up with all of the modern hardware, software, and protocols, as the industry moves too fast. But why should they keep right on the bleeding edge? While RS232 may be old, learning about RS232 teaches you the PRINCIPLES of communication, thus better equipping you to learn new interfaces. The same goes for XMODEM. USB and FireWire are pretty fucking complex protocols to jump right into when you haven't covered any kind of communication standard before. But considering how ubiquitous USB is becoming, I think it should absolutely be included in the curriculum.

    On the other hand, there's no excuse for teaching TokenRing. For the love of god, spend that time teaching ethernet.
    • To understand the present, you must understand the past. That is why archaeologists painstakingly excavate ancient sites. That is why history is taught in schools. And that is why you have to learn RS232 before you learn USB. It's an obligatory point of passage.
    • Token ring is still used quite a bit in industrial settings, like oil refineries (my uncle is a manager at an oil refinery). They like to use what's been proven to work reliably, and token ring doesn't fail them. They even still use arcnet, because it works well over the distances of the plant, and it doesn't ever crap out.

      Not to mention, token ring uses a different way of communicating than any other line protocol I'm aware of. I would love to learn token ring.

      Another thing I think should be taught at
    • I took the class at Pitt, and I think it was a logical step to go to token ring. First we set up communications between two computers. Then three. Then the entire lab. What is the simplest way to implement a network now that you've just learned RS232? Token ring. Ethernet has a lot more going on than token ring.
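      The token-passing idea behind that lab can be illustrated with a toy loop (a hypothetical Python simulation, nothing like the actual assembly assignment): only the station currently holding the token may transmit.

      ```python
      # Toy token-ring pass: only the station holding the token may send.
      # Hypothetical simulation, not the course's DEBUG/NASM assembly.
      def run_ring(stations, messages, rounds=2):
          """stations: list of names; messages: {sender: (dest, text)}."""
          delivered = []
          token = 0  # index of the station currently holding the token
          for _ in range(rounds * len(stations)):
              name = stations[token]
              if name in messages:  # the holder may transmit one frame
                  dest, text = messages.pop(name)
                  delivered.append((name, dest, text))
              token = (token + 1) % len(stations)  # pass token downstream
          return delivered

      out = run_ring(["A", "B", "C"], {"C": ("A", "hi"), "A": ("B", "yo")})
      print(out)  # [('A', 'B', 'yo'), ('C', 'A', 'hi')]
      ```

      The point of the exercise: with a circulating token there are no collisions by construction, which is why it is a gentler first network than CSMA/CD Ethernet.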

      Besides, there were only two or three weeks to implement the token ring stuff. And everything was in assembly, written in DEBUG. The smarter students used NASM. We developed our own c
  • Chances are, the prof has been using these notes since the dawn of time. I had one professor whose notes looked 30 years old: paper scrawled in illegible chicken scratch and fraught with errors. Most of the time, the same professor will teach the same class for quite a few years, because it is easier to teach something you are familiar with than to learn the ins and outs of the latest technology; otherwise it would interfere with research.

    Don't for a minute think that the professors are there to teac
  • No, seriously, what happens if the lab gets a lightning strike? How would the school find replacements for the outdated equipment? Sure, some of it is off the shelf (RS-232), but the token-ring stuff is getting really hard to find. Better for the school to make a planned, scheduled move to current (replaceable) equipment than be faced with downtime in the lab while new platforms are sourced.

    For what it's worth, a serial port is a serial port, and RS232 is a very forgiving place to start an interface course.

    • They can replace it with a little searching on eBay, or, more likely, they've got another closet full of the stuff just down the hall.

      In addition to the other good reasons mentioned here for starting people out on "legacy" concepts, there's also the principle, very important in a budget-conscious setting, that if a student screws up and fries an old piece of equipment, it's no big deal and they've learned what not to do, whereas if it's all new expensive stuff everybody's going to be too scared of frying it to l

  • A lot of the courses had already been changed.

    But to highlight an example, the 3D graphics course in third year is revised each year to adapt to developments from the past year. For example, pixel shaders are taught in the current third-year course, and who knows what else next year.
    However, the second-year introductory graphics course has stayed mostly the same, introducing the basic concepts which are applicable to more than one situation and don't change very often.

    I would assume that a lot of the concept
  • by hengist ( 71116 ) on Friday March 14, 2003 @11:25PM (#5517187)
    if it wasn't for the students.

    I teach two undergraduate courses. I know what it's like to have students complaining about the content of a course, and I have two comments about this topic.

    Firstly, changing what is taught in a course is very very very very hard work, and a course that has been restructured or had its content changed is very very very likely to have problems with said new content. It is simply not practical to keep updating a course to deal with new technology. Once a course is stable, it is far better to leave it that way. Also, the staff teaching that course must spend time doing research and likely supervising postgrad students. They must do this to keep their job and to maintain the reputation of the university.

    Secondly, universities are not vocational training institutes. University teaches the basic theory and concepts behind the technology, and teaches students how to learn these concepts. The student should then be able to apply these theories and concepts in an employment situation.

    If you want to learn how to use new technology solely to apply those skills to a job, go to polytech or do a training course. Don't sit around whining to the course instructor, because frankly he probably knows a hell of a lot more about how to run a course than you do.

    • I think that's a shameful answer to an important issue. It appears that what you describe as very very very very hard work is simply an unwillingness to do what is necessary to keep your students at the cutting edge of their field. I understand that basic theory must be taught, and using tried-and-tested material is the easiest approach. However, it's also important to expose students to the most modern practices and tools in their field. This can mean the difference between them getting jobs, or being
      • DUDE, you're not getting the POINT. University is NOT FOR GETTING PEOPLE JOBS!!! It's for academic enrichment, broadening your horizons, learning things because they're interesting, and simply learning how to learn. Stop complaining and either be the change you want to see in the world, or go somewhere else. Why don't you teach your own course about the latest industry standards?

        As nearly everyone else has said... it's the concepts that matter. Whether it's at 128kbits or 1mbit, a serial communication
        • So are you saying the chem department should be teaching alchemy? It has lots of useful skills that aren't taught to modern chemists, even though most of the current glassware and Chem 101-type skills came directly from alchemy (things like flame tests for gases, mixing rules, boiling procedures)...

          Sometimes the world moves on, and a university should at least be near the advanced state of the art, to the point where new students know where to go when they are required to work on it. Just look at the posts her
      • I think that's a shameful answer to an important issue

        Not to be too flip, but this issue is a minor one. The important issue is how to deal with student feedback

        Reading your post, I think you missed my second point, which was:
        universities are not vocational training institutes

        it's also important to expose students to the most modern practices and tools in their field.

        No, it is not. It is important to teach students how to learn these things.

        This can mean the difference between them getting jobs,

        • by Anonymous Coward
          Reading your post, I think you missed my second point, which was:
          universities are not vocational training institutes


          This echos what was said before this comment, really. I recall my first night class (I wasn't getting up at 8AM for any class!) as a lowly EE student. The professor plodded into the hall (it was a 200 seat lecture hall), laid down his books and course material on the lectern, then proceeded to write the requirements down on the blackboard. After 5 minutes of silence except for the scraping
  • What flavor of engineering are you attending school to learn? There is still lots and lots of test and measurement equipment being produced that uses good old serial communications. It's cheap, reliable, and still more universal than Ethernet and its friends. Even if later in life you end up using, say, TCP for your data collection, what is that TCP stream? Little more than an emulated serial connection. You want to know how this stuff works, really.
  • Learning how to think is more important than learning how to do.
  • by Anonymous Coward
    I've had similar situations in my cs degree where I found the material almost archaeological. But it's the concepts that are important to learn at the start and the roots of these concepts are in the old stuff.
  • by muscleman706 ( 654133 ) on Saturday March 15, 2003 @01:11PM (#5519704) Homepage
    I don't know about Token Ring, but RS232 is all over the place in industrial hardware like barcode scanners and other non-PC hardware. I think it is much simpler to program, both for the programmers and the hardware designers. Also, remember that Intel came up with USB to sell processors, because USB is a total CPU hog compared to FireWire. So, while your PC does not have a problem with this now, industrial hardware certainly does not have the infrastructure on board to deal with USB. So, I think the appropriate thing is to talk about RS232, USB, IrDA, Bluetooth, and WAP. You want Bluetooth because it is going to be in all cellphones, and hence will proliferate into everything else. You want WAP because for things where Bluetooth is too slow, you will want a higher-speed wireless system. For instance, you could have a WAP-enabled digital video camcorder that automatically pops up a recording window when you start recording, all without any wires!
  • Math (Score:4, Insightful)

    by jbolden ( 176878 ) on Saturday March 15, 2003 @01:21PM (#5519738) Homepage
    When you first learn math, you don't start in nursery school / kindergarten with "Let Delta be a derived functor mapping abelian categories..."; you don't learn 20th century math at all. Rather, what you learn is:

    counting -- a technology that is certainly tens of thousands of years old
    arithmetic -- a technology that is many thousands of years old and was fully developed 5000 years ago
    algebra of one variable -- a technology that is a thousand years old
    geometry of 2 dimensions -- a technology that is over 2000 years old.

    And if you are really good in high school you learn
    calculus of one variable -- a technology that is over 300 years old

    By college, undergraduates make it up to about the Civil War.

    ____________

    There is a difference between education and vocational training. Education teaches you how to evaluate information and how to learn new information. Vocational training teaches you specific information for a specific field. The goal is to teach concepts, not technologies.

    What you are learning are very simple hardware/software interfaces. Why use the complex interfaces of modern hardware, which confuse the issues, in an academic course? Leave that for vocational schools.
    • And how many people here know that the rules used to do basic arithmetic are based on a right-to-left writing style? When the Europeans stole the ideas from the Arabs, they didn't know how to reverse the rules for a left-to-right writing style. That's why when you're taught to add and multiply it's "start on the right and work backwards". If you were to see a phrase like "one hundred twenty three plus twelve equals 135" in an old script it would look like "135 *$%*$%*$ 12 *!@#* 123". (I used the
  • If you want to do a class in current technology, the best way is to go independent study. If there is an instructor with relevant knowledge, it shouldn't be that hard to get something set up.

    Consider yourself ahead of the game when you look for a job. If the university wants to move ahead in the course outline, they have "supporting evidence" for next year.
  • I'm probably in trouble on this one, but here it goes:

    I think this problem is unique to engineering. Universities were not created for this kind of thing. Thus you have an inherent conflict between (a) getting a job and (b) learning theory.

    Historically, you would get a degree *in a different field* (probably physics or math) then go to professional school, such as law school or medical school (this one would be engineering school) and learn all the latest applied technology there.

    I am not saying this i
  • by James Youngman ( 3732 ) <jay.gnu@org> on Sunday March 16, 2003 @04:44AM (#5522914) Homepage
    I think you don't stand a chance. To get the course material updated, you will have to do one of two things:-
    1. Get the lecturer replaced with someone else - this means someone else has to be willing to teach the course
    2. Find the lecturer a whole lot of free time to revise the course material (which I assume has been generated over a period of years) all at once. This probably means taking time away from research, which is what your professor probably feels he's there for anyway.
    Even if you succeed, the material won't be updated while you're on the course. At best, the next course would start with the new material.

    (This reflects the situation in the UK, where academic teaching staff in universities almost always have research commitments, and publications are used as a performance metric.)

    Some of the material you are working with is not so bad, either. Learning about RS232 might teach you several things that are generically useful in designing system interfaces :-

    1. It's Not The Hardware, It's The Protocol Design, Stupid
    2. Race conditions between the two ends
    3. Reliability measures (Checksums, CRCs, ACK/NAK, windows)
    4. Resilience versus Bandwidth (e.g. max reliable baud rate ~ 10000/(RS232 cable run in feet))
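    Item 3 in that list is easy to demonstrate in a few lines. A toy stop-and-wait frame with an 8-bit checksum and an ACK/NAK verdict (a hypothetical framing invented for illustration, not any real standard):

    ```python
    # Toy stop-and-wait framing illustrating checksum + ACK/NAK
    # (hypothetical frame format, not any real standard).
    ACK, NAK = 0x06, 0x15

    def frame(seq: int, payload: bytes) -> bytes:
        # Frame layout: sequence byte, length byte, payload, 8-bit checksum.
        checksum = (seq + sum(payload)) & 0xFF
        return bytes([seq & 0xFF, len(payload)]) + payload + bytes([checksum])

    def receive(raw: bytes) -> int:
        """Return ACK if the frame's checksum verifies, else NAK."""
        seq, length = raw[0], raw[1]
        payload, checksum = raw[2:2 + length], raw[2 + length]
        return ACK if (seq + sum(payload)) & 0xFF == checksum else NAK

    good = frame(1, b"hello")
    print(receive(good) == ACK)       # True: clean frame is acknowledged

    bad = bytearray(good)
    bad[3] ^= 0x40                    # single bit error on the wire
    print(receive(bytes(bad)) == NAK) # True: corrupted frame is rejected
    ```

    The sender's side of the loop is then just "transmit, wait for ACK, retransmit on NAK or timeout", which is the core of every reliable link layer down to XMODEM.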

    It's been a while since I've worked heavily in industrial interfacing, but I'd be surprised if USB is even relevant to that at all. Think more along the lines of RS422, 10baseT, and optical fibre (often carrying converted RS232, in fact).

    I'm not particularly familiar with XMODEM, but I think it's likely to help you understand valuable facets of the above (bandwidth/reliability tradeoffs, protocol features for catching errors, latency versus throughput, bandwidth-delay products). Token Ring seems an odd choice to me, though. After all, the hardware must be tricky to get these days (or perhaps your course has no hands-on component, which would make hardware availability irrelevant).

    One of the most interesting hardware interfacing things I've done was implement both ends of a mostly-symmetrical serial protocol. One end was implemented as a set of four cooperating threads, and the other as a state machine. One way of doing it was (in that case) much much easier than the other (less code and more reliable).
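    A minimal sketch of the state-machine approach (a hypothetical three-state parser for frames of the form SOH, length, payload; not the protocol described above):

    ```python
    # Byte-at-a-time state-machine parser for a hypothetical framing:
    # SOH, length byte, then <length> payload bytes.
    from enum import Enum, auto

    SOH = 0x01

    class State(Enum):
        WAIT_SOH = auto()
        WAIT_LEN = auto()
        IN_PAYLOAD = auto()

    class Receiver:
        def __init__(self):
            self.state = State.WAIT_SOH
            self.remaining = 0
            self.buf = bytearray()
            self.frames = []

        def feed(self, byte: int) -> None:
            if self.state is State.WAIT_SOH:
                if byte == SOH:            # ignore noise until frame start
                    self.state = State.WAIT_LEN
            elif self.state is State.WAIT_LEN:
                self.remaining = byte
                self.buf.clear()
                self.state = State.IN_PAYLOAD if byte else State.WAIT_SOH
            else:                          # IN_PAYLOAD: collect the body
                self.buf.append(byte)
                self.remaining -= 1
                if self.remaining == 0:
                    self.frames.append(bytes(self.buf))
                    self.state = State.WAIT_SOH

    rx = Receiver()
    for b in bytes([0x99, SOH, 3, 10, 20, 30, SOH, 1, 42]):
        rx.feed(b)
    print(rx.frames)  # [b'\n\x14\x1e', b'*']
    ```

    Because all the parsing context lives in a couple of explicit variables, this style tends to be smaller and easier to reason about than coordinating threads, which matches the poster's experience.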

  • The 'out of date' technology I got to learn was Modula-2, back around '91/'92. I was mortified at the time.

    Looking back though, it didn't really matter. I went from that to C/C++, Java, Python, etc. College isn't to teach you a specific technology, but the fundamentals, and how to learn the rest.

  • When I was still in college, I upgraded the lab course for assembly programming. The lab course used to be PDP-11 based, and we used a semi-working MS-DOS based emulator. We upgraded to PowerPC assembly - x86 was rightly regarded as way too complicated as a starting point.

    Making the new assignments and getting everything to work on the local computer setup (this included writing an eerie program that used TCP/IP to communicate between the PowerPC emulator and a custom-built terminal emulator) took the better p

  • I see lots of comments pointing out that for specific technology skills, you would be better off at a trade school, and that general concepts based on tried-and-true, simple example technologies are what you should expect at Universities.

    Isn't a balance possible? I have been well-served by the general concepts I picked up while working through my engineering degree, but a more practical class or two would certainly have been welcome. I fully expect to be learning new technologies regularly during my care
