Education

Improving CS Education? 227

sachachua asks: "You know CS could be taught better. I've decided that I really want to get into improving CS education. My university seems like the best place to start. I like the way we do CS, but I know it can be improved. I want to find the weaknesses in our CS program. I want to know how other schools are teaching computer science - what they're doing better, what they're doing differently. Then I want to help improve the way CS is being taught and learned. I'd like to benefit from this before I graduate. (See? I have a selfish reason after all.)" Do you think that the current CS teaching practices used in college or uni are insufficient for the new century? How would you improve the curriculum and the way in which these classes are taught?

"I've been reading through the ACM SIGCSE proceedings and there are some studies I'd like to do in order to give myself some research training (we don't get much of that in our regular curriculum). We're also trying to organize peer mentoring and code clinics (but that brings up interesting ethical questions). What are other things students can do to improve the quality of their CS education?

My university's already doing some pretty cool things. One of my professors sneaked some patterns into an introductory course. The teachers are super-approachable. But it's just that I look at our curriculum and classes and teachers and I realize that there's so much more that can be done.

I want to know this:

Lots of people have probably already tried this before - improving the way CS is taught/learned. I want to know how it turned out or how it's turning out. Do you have any advice? Notes? Ideas? Know anyone I should talk to? I'm an insanely motivated geek here and I want to make sure I get the best CS education with the (rather limited) resources available. If it incidentally helps other people, then that would be a Really Good Thing. =)"

This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    I interview and hire a lot of developers. I've noticed over the past couple of years that the skill sets of graduates have been decreasing. I used to have no problem finding a quality C++ developer out of school. Now, all the graduates think they are master programmers because they have been doing scripting languages like Perl. I think the education CS majors are getting should teach them the basics like algorithms and data structures, but should also encourage the mastering of a language like C++. Other languages like Perl, VB, etc. can be taught later.

    In the last group of interviewees I had, I asked them to describe their C++ skills. The ones who said "very strong" I asked a couple of questions: "What is an abstract class?" and "Can a child access its parent's private data?" No one got either question right, yet they are "strong" C++ developers.
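    (For reference, a minimal C++ sketch of both interview points; the class names are made up for illustration. An abstract class has at least one pure virtual function and cannot be instantiated, and a derived class cannot touch its parent's private members.)

        #include <iostream>

        // Abstract class: at least one pure virtual function, so it cannot be
        // instantiated directly.
        class Shape {
        public:
            virtual double area() const = 0;   // pure virtual
            virtual ~Shape() = default;
        private:
            int id_ = 0;                       // private: invisible even to derived classes
        };

        class Circle : public Shape {
        public:
            explicit Circle(double r) : r_(r) {}
            double area() const override {
                // return id_;                 // error: 'id_' is private within 'Shape'
                return 3.14159265 * r_ * r_;
            }
        private:
            double r_;
        };

        int main() {
            // Shape s;                        // error: cannot instantiate an abstract class
            Circle c(2.0);
            std::cout << c.area() << "\n";
        }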

    With all the companies paying high salaries to moderately skilled CS majors, the incentive to achieve a good education decreases.

  • by Anonymous Coward
    One of the most important CS classes I took was taught in a MADE-UP language. A la SPIM or others, it was an assembly language. The prof explained to us the syntax of the very-simple language, and we wrote very simple programs. He then showed us more complicated aspects of the language, and we wrote more complicated programs. The fun part was - he could change the language. One week it was a single-operand language. The next it was a three operand language. We got to really see how various architectures differed in concept. By the end of the semester, we were writing assembly subroutines, stack frames, local variables, etc etc. Not only did I grasp this made-up language, but I understood this made-up machine. When I went to learn x86 asm, I realized "my god - he wasn't far off!" I already knew what was happening. This is the fundamental class that changed my path from being a Network Admin to being a programmer. Thanks Dr. Sanders.
  • by Anonymous Coward
    The big question to ask yourself is: what is your desired end goal? You say you want CS to be taught better. Why is that? Do you feel you're not learning enough? You're not learning the right things? The right things for what? To get a job in today's market, to continue into research, to innovate, to...?

    I'm in a position where I work to interview people for programming positions in C++ or Java. Sometimes we look for entry level programmers, fresh out of school. I can tell you the things that I, as an interviewer, look for.

    First, I really need someone who knows C++ or Java. That's because we program in those two languages, and we don't have time or money to train people to make the jump from something like Modula-2 (or are they on 3 now?) or Scheme or some other language. That's a practicality for us.

    I pay attention to coursework. I like asking people about the courses they took: which ones, which did they like the best, which didn't they like? What did they take as electives? I like it when people receive such basics as software design principles, algorithms, databases, and data structures. I don't expect new grads to be up on the latest technology. A lot of schools don't teach java, a lot of schools don't mention object-oriented databases... these are things that upper level courses might address, but not undergrad courses. That's okay. I want to see that people are learning the basics.

    As far as I'm concerned, here are some things that are good for people to learn in their 'computer science' training:

    • How to learn. You're not going to know everything, ever. Learning is vital.
    • How to communicate. Communication is a big, key skill in the workplace. I wish more colleges had group projects. Whenever I hear about someone working on a group project, I have lots of questions for them. The workplace is one large group project. Besides playing well with others (tm), communication involves how you express your thoughts both in groups and one-on-one situations.
    • How to design. Object-oriented design skills are especially useful in today's job market, at least. Just learning how to structure code in a logical manner, how to think in terms of objects, the basics are what I look for.
    • How to differentiate between what you know and what you don't know. This goes back to the "you won't know everything" principle. Maybe someone had a database class that touched on object-oriented databases, so they mention that in passing. I might ask more questions, when really it was just a brief overview they received. They can go on and try to fake it, or realize that oftentimes "I don't know" is a perfectly valid answer.
    Anyway, this is going slightly off-topic, in the direction of CS as the means for entering the job market. In summary, though, what I think CS programs should emphasize most are the basics: data structures, algorithms, good design techniques, etc. It's nice to have the option to learn the latest technology, and obviously the world isn't looking for a lot of Fortran 77 and Cobol programmers these days. But I don't think CS courses should necessarily worry about teaching the latest and greatest bleeding edge technology. Teach students how to think, create, organize, design, communicate, and learn. How? More basics. More group projects. More projects in general. Accessible professors and TAs, and students that take advantage of them. It's the sum of the whole that you get from a CS degree, not '15 cool uses for XML' or 'how to use java to make a drawing applet'.
  • I was under the impression that the OSI model was 7 layers - Application, Presentation, Session, Transport, Network, Datalink, and Physical, while the TCP/IP model has 4 layers: Application, Transport, IP Layer(Network), Datalink/Physical.

    The latter model is usually more useful, unless you are working outside the realm of IP (I've only ever done Net programming for TCP/IP).
  • There are a few problems with this approach.

    1. For the sake of purity, Java is NOT purely OO, because it has primitive types (among other things). For a truly OO experience, try Smalltalk :-)

    2. Not every student wants to be a high-level programmer. Teaching OO is fine, but the best students I know are those who have a good understanding of what's happening under the hood, e.g., what's going on inside a Java VM, for example. Some kids just want to hack C, not Java.

    3. It's not just the issue of OO itself, but actual Software Engineering. The problem here is that good Software Engineers tend to want to enter industry, but a lot of professors that I know are more interested in students going for grad school to do research. The head of the department at my university, for example, is planning to REMOVE Software Engineering courses from the curriculum, and I think his motivation is to keep as many people as possible from entering the industry. ACM's recommended curriculum focuses quite a bit on Software Engineering, though.
  • it's way easier to blow the system

    No it isn't. A stray pointer in asm is the same as in C -- either one will crash your program (or maybe your OS if you are using some crap like a MacOS or Win9x derivative).
    Well that's why you need the calculus. I like calculus, a lot. I almost switched to be a math major instead of CS. I liked linear algebra too. I can't wait to take numerical analysis. =p
    You don't even have to go to a top notch university to get professors whose actual job is teaching. Ball State University is not the greatest college, but it isn't a research university, and all of the classes are taught by REAL LIFE PROFESSORS. In my two years here so far I have had no classes taught by TAs. The class sizes are pretty small and all of the professors are required to have reasonable set office hours every week so that students can go and get help if they need it. I have never had a problem getting a hold of a professor if I needed it. That's not to say that I have liked all of my professors, but that's another story. :P
    The third major problem with our CS department is the teachers. The teachers we have are quite intelligent, and a few of them are really good computer scientists. However, some of them are just not qualified. We ask them all kinds of advanced questions and they don't have answers. There are only two teachers we can take our complex problems to, because the rest don't know.

    Worse than that, the others don't know how to find those answers, and if they did know, wouldn't have the attitude that they should go looking for them. This is a basic problem involving schools and schooling, and the damage which the school system does to people's psyche. The only answer to this is to replace schools with something less rigid, something that works. They are beyond reform.
  • Coursework alone will never offer as great a learning experience as having a real project to sink your teeth into.

    ``Amen, brother, amen!'' (-:
  • I think that the gender issue has to do with the fact that women are generally not encouraged as much to excel in fields of math and science.

    More than that, the aspects of maths and science taught in school (``all your mind are belong to us'') are generally taught in a way incompatible with feminine strengths. Women pick up better on things with a wider range of information and perceptual channels available (relatively speaking, men just become more confused as more data channels open). School is in the business of narrowing channels of perception, blandifying information (lopping off sidebands), and chopping data down into such small pieces that the big picture, if any, is at best elusive. School (realschule) was designed to do this specifically to stop people from learning to think [fdns.net].

    Femmes actually make better CS experts in many cases because of their built-in greater ability to sift through masses of information for specific items. This is why many wives are able to find their husbands' keys when the husband (who actually put the keys wherever they are) cannot. They pay a price in things like spatial perception (think of reverse-parallel parking).

    There are some interesting biases: femmes are often better at low-level OO because they are landmark-oriented, their opposite numbers are often better at structure because they are map-oriented.

  • Man, I'm at Georgia Tech, and even though it's all Java for the first two semesters, I've already had to implement Linked lists, binary trees, hash tables, and a bunch of other crap that you haven't mentioned. Just because you use a language that provides them doesn't mean you have to use them. And I hate my professors for not letting me use java.util.Vector, such a timesaver.

    --Xantho

    After many years in IT, working on enterprise computing, I have reluctantly concluded that there is very much a "research-like" scientific element to extending or maintaining/debugging complex systems.

    When you are dealing with huge interacting systems with hundreds of millions (billions?) of lines worth of code, their total absolute behaviour ***IS*** impossible to predict. Yes, there are guidelines, requirements, previous behaviour, etc., but when a bug pops out, especially during new development, it takes forensic skill and competence to unmask the metasolutions that went into these systems and produced unexpected results. This requires "understanding" a solution that was implemented perhaps decades earlier. This understanding can be very much a scientific pursuit requiring enquiry and investigation, and sometimes even hypothesis and experimentation to uncover.

    I think CS depts in general completely ignore this aspect and rarely (if ever) teach the skills necessary to perform this task. Further, CS depts have yet to teach or promote what it means to work in a team. For those with experience working in Labs, this can be an essential element in scientific thinking because doing scientific research usually involves building on the work of others, not necessarily in the sense of results obtained, but in understanding procedures and protocols that were used in determining those results.
  • Yes,

    You're right about the application of Pascal to demonstrate the concepts of programming.

    Top-down design or bottom-up? I cannot remember anymore; it was so long ago.

    I thought about mentioning that in the original post, but then... I have a point to make :-)
  • They have no interest in learning - they just want the degree to go make some money.

    HA! I thought that was what college was all about? After all, why only accept students who want to learn when you can just take in the vast majority of them and charge them an arm and a leg for tuition? Seems like a good way to make money to me.

    If you were being forced to go to college by society (and/or your parents), wouldn't you want to get the best out of it--even though you don't want to be there, don't want to learn anything, and just want to make more money? By "the best out of it" I mean choosing the degree that gives you the best "piece-of-paper-to-money-making ratio".

    Higher education stopped being a place to learn for most people after the Vietnam war. It filled the job market with degrees. Eventually it became the norm--because to compete in the job market, you needed to look better than the other guy... who had a degree--because he was drafted and then had his education paid for by the government.

    So if you really want something to blame for the lack of interest in college, blame the Vietnam war.

    -Riskable
    "I have a license to kill -9"
    http://YouKnowWhat.com
    This is actually pretty normal for novice instructors. You are clearly one of the better students from your class, because you made it to grad school. Yet when you recall your experience as an undergrad, you probably assumed that you were middle of the pack (as this study [apa.org] shows).

    Then you go to teach, and the top few students seem pretty decent (they're much like you) and the rest of the class seems to suck. Well, no. The rest of the class sucks as much as they ever did, only now you have to notice, because you're grading all the papers, instead of hanging out with the leet geek types.

    Crispin
    ----
    Crispin Cowan, Ph.D
    Research Scientist, WireX Communications, Inc. [wirex.com]
    Immunix: [immunix.org] Security Hardened Linux Distribution
    ----
    Research Assistant Professor [ogi.edu] of Computer Science
    Oregon Graduate Institute [ogi.edu]

    Does the BMath make much of a difference? I know Waterloo students have to take a couple more Math courses than at other Canadian universities -- but obviously not that many more, because entrance to the MMath program doesn't specify them. Should other universities be looking into a Faculty of Mathematics?

  • At my uni the first session of both computing degrees (CS and CE) starts out teaching Haskell (a cool functional language). This means the students get into coding reasonably large programs straight away, and get an idea about design tradeoffs and other things that you would never get from assembly.

    Show assembly to most first years and if you don't scare them off computing immediately they'll get stuck in a mess of pointers and never get anywhere. You certainly won't be able to do any programs of a reasonable size, and you probably don't end up teaching the most important aspects of computing (like handling complexity with abstraction).

    Second session is C (used to be Java, but they went back to C because later subjects need it), and only in second year do they start seeing assembly in conjunction with digital logic (which is where it actually makes sense).

    --

  • by Anonymous Coward
    I am a computer scientist. (I am a research scientist at a lab.) I use calculus almost every day, along with linear algebra, probability theory, etc. Please:

    Computer Science is NOT Web Programming.

    Computer Science is NOT developing a "Java Enterprise Servlet Bean"


    CS is the study of theory, algorithms, developing new ways of processing information efficiently. Teaching coding in a CS program should be minimal at best. Obviously, you need to know C, etc. to develop algorithms. But you shouldn't be taught how to make a "Shopping Cart" in a CS degree class. That's completely applied and should be taught in an IT program or something similar.

  • The thing is, in physics, no matter how many equations you use, ultimately, it's the physical world that decides if you are right: either particle X acts as you predict, or it does not. This empirical foundation is what makes science scientific.

    Math and Computer Science need not touch the physical world at all -- computer scientists can (and do) prove the correctness of algorithms that can only run on non-existent quantum computers.
  • Math is an invention. Physics is an observation. Math was invented in order to better observe physics.
  • Your point is well taken. However, it is also wrong. Math is a branch of physics, not the other way around. Math was developed to describe natural phenomena. Calculus was developed to describe the movement of the planets and the acceleration due to gravity. Trigonometry was developed to describe triangles & circles. Linear algebra has its roots in describing cross-sections of cones.

    As for the term "philosophy", this dates back to Ancient Greece, when all studies were considered "philosphy". It has more to do with a different meaning of philosophy--that is, the pursuit of learning. This is quite different than the modern interpretation of philosophy, where people ruminate about abstract concepts to discover absolutes.

    But, yes, your point is very valid. It is quite absurd to take all of these topics back to their roots.
  • I absolutely disagree. I've seen Java used in the first year, in a true OO-from the ground-up approach, to amazing effect. Done right, the students are capable of much more interesting coding (both in "personal interest" and "interesting CS fundamentals") in the first year than they ever would have been in a procedural/modular style program.

    The problem in question is that your instructor(s) had no idea how OO should be taught. Even if one can program in an OO style, an instructor used to teaching old-school structured or modular style often results in a machine that makes broken students.

    It's nigh-impossible to express "the right way" in a /. comment, but I'll point out that Brown University (formerly using OO Pascal/Delphi, now Java) and Virginia Tech have excellent programs that do OO from the first day. They aren't the only programs that do it right, either. It's a matter of OO requiring a very different teaching style. Brown's old book (Object-Oriented Programming in Pascal, A Graphical Approach, by Conner, van Dam, and Niguidula) didn't hit control structures until chapter 10 or so. Earlier chapters dealt with OO conceptual fundamentals, including non-programming modelling exercises (think in objects), object instantiation, method invocation, composition, and inheritance. Sounds rough? Nope, it's as natural as water in the way and order they teach it.
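    (To make those four fundamentals concrete, here is a minimal sketch -- in C++ rather than the Pascal or Java that course used, with made-up classes -- of instantiation, method invocation, composition, and inheritance, with no interesting control structures at all:)

        #include <iostream>
        #include <string>

        // Inheritance: a Dog "is an" Animal and overrides speak().
        class Animal {
        public:
            explicit Animal(std::string name) : name_(std::move(name)) {}
            virtual ~Animal() = default;
            virtual std::string speak() const { return "..."; }
            const std::string& name() const { return name_; }
        private:
            std::string name_;
        };

        class Dog : public Animal {
        public:
            using Animal::Animal;
            std::string speak() const override { return "Woof"; }
        };

        // Composition: a Household "has a" pet.
        class Household {
        public:
            explicit Household(const Animal& pet) : pet_(pet) {}
            void greet() const {
                std::cout << pet_.name() << " says " << pet_.speak() << "\n";
            }
        private:
            const Animal& pet_;
        };

        int main() {
            Dog rex("Rex");          // object instantiation
            Household home(rex);     // composition
            home.greet();            // method invocation, dispatched through Animal&
        }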

  • While there are many cases where Calculus is not used, when working in Graphics and Sound, I've found that all that math comes in REAL handy. I know that math isn't something everyone finds second-nature, but it's really worth it, and I wish I had done better in some of my calc classes, looking back.

    If you never expect to write your own graphics or sound library, it may not be as useful :-)
    As a simple example, vector calculus is quite useful in 3D graphics programming.
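    (A hand-rolled illustration, not tied to any particular graphics library: the cross product of two triangle edges gives the surface normal, and the dot product with a light direction gives the Lambertian diffuse intensity.)

        #include <cmath>
        #include <cstdio>

        struct Vec3 { double x, y, z; };

        Vec3   sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
        Vec3   cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
        double dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }
        Vec3   normalize(Vec3 v)     { double l = std::sqrt(dot(v, v)); return {v.x/l, v.y/l, v.z/l}; }

        int main() {
            // A triangle in 3D space.
            Vec3 p0{0, 0, 0}, p1{1, 0, 0}, p2{0, 1, 0};
            // Surface normal = normalized cross product of two edges.
            Vec3 n = normalize(cross(sub(p1, p0), sub(p2, p0)));
            // Lambertian (diffuse) intensity = max(0, N . L) for light direction L.
            Vec3 light = normalize({1, 1, 1});
            double intensity = std::fmax(0.0, dot(n, light));
            std::printf("normal = (%g, %g, %g), diffuse = %g\n", n.x, n.y, n.z, intensity);
        }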
  • So here's my question: What is the relative size of these two groups? Are they roughly the same? Or is the second group (the one that
    generally has trouble with pointers) larger?


    The latter is larger, but for a different reason. It was only a couple semesters after I took the class that they changed it. This is my 5th year in school, so most who took it with me or before me already graduated, while there are 3 1/2 years of students who took the less intense version.

    now-a-days rarely-used aspects of programming

    What, are you suggesting that pointers are a rarely used aspect of programming? Pshaw.

    But, in my day-to-day work, I almost never find myself needing that ability.

    Well, I spend most of my time in Verilog these days myself (which isn't a programming language, but looks like one from a distance). But when I need to simulate something fast, I write it in C.
    I think the first thing you should teach students is assembly for a simple RISC instruction set like MIPS. Why? Because it teaches you what all programming languages are fundamentally made up of -- the things the underlying hardware can do.

    When I took it, our introductory computer engineering course taught assembly, followed by a little C. Later, they changed it to just C, and also changed the 2nd course from C to C++.

    Here's what I noticed (in general) regarding the two groups:

    Those who took the class with me are as a whole not afraid of pointers. They understand the difference between a memory location and the contents of that memory. When they get to it, they have little trouble adapting to higher level languages, because they understand how those languages work.

    Those who took it later are uncomfortable with pointers. They might not even know what one is -- not even enough to explain pass by reference. They have a difficult time moving from C++ or Java to something lower-level, such as C.
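    (For what it's worth, the distinction boils down to a few lines of C-style code; this is illustrative only.)

        #include <cstdio>

        // Pass by value: the function gets a copy; the caller's variable is untouched.
        void bump_copy(int n)  { n = n + 1; }

        // Pass by pointer: the function gets an address and modifies the contents at it.
        void bump_ptr(int *p)  { *p = *p + 1; }

        int main() {
            int x = 41;
            int *px = &x;               // px holds the address of x, not its value

            std::printf("value %d at address %p\n", *px, (void *)px);

            bump_copy(x);               // x is still 41
            bump_ptr(&x);               // x is now 42

            std::printf("after calls: x = %d\n", x);
        }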

    The problem with teaching OOP is that then they only know OOP, and some would argue that OOP isn't even all it's cracked up to be. If you know assembly, you can code in anything.

    In fact, at least in my case, I didn't truly understand C++ until I learned assembly. I did it in the wrong order, and it was hard.
  • The main problem I had with CS at uni (in the UK btw) was that the courses they were teaching were not keeping up with current technology. This is unavoidable to a certain extent, since those teaching the course have to learn all this new stuff before they can teach the rest of us. I started in 94, and that was the first year they started teaching C++. They still didn't have a proper course on Java when I graduated in 98. Unless the uni is well funded and can afford to hire people with the appropriate knowledge, this is going to continue. As far as I can tell, there isn't a simple solution.

    A good proportion of the knowledge I gathered during uni was gained from working on my own projects, hacking away til 4 in the morning on some new piece of code that interested me. Had I not gone to uni this would probably never have happened. The environment is as important to the learning process as what is actually officially taught, in my opinion. As long as you want to learn, that is.
    The head of the department at my university, for example, is planning to REMOVE Software Engineering courses from the curriculum, and I think his motivation is to keep as many people as possible from entering the industry. ACM's recommended curriculum focuses quite a bit on Software Engineering, though.

    There are reasons for this:
    • Universities are businesses too. They need researchers to survive. Thus, they wish to recruit them
    • A university is not the ACM and it is not a place to necessarily get trained for work in the industry

    Granted, removing software engineering is probably not a good way to get students to come to the department...

    Woz
  • Your point is a good one and has a lot of merit, as does another person who posted a similar thought. I must admit, I do not disagree with you. But I must point out what I (effectively) said at the end of my post: Computer science and Software engineering are not the same thing. Do not confuse the two.

    CS is the study of algorithms. SE is the study of systems. Taking an algorithm and translating it into a procedure is more natural in C, but making good use of it is more natural in C++/Java, especially once the notion of OO has been shown to you.

    Imagine CS and SE as sets (yes, I see the irony here). CS intersect SE is non-empty, but CS minus SE (and vice versa) is also non-empty. In other words there are parts they do not share.

    Any one in SE needs the basic skills of CS. They need to understand data structures, algorithms and basic logic. After that, things diverge. For example, I'm in my last term and for all my courses this term, I have not written a line of code for a course. Yes - I am in computer science! But my courses are theory - what can computation do? how does this algorithm work? etc. I don't care at all about objects, methods or obfuscation. It's not even applicable. In fact, I don't even see how it would help you understand it any easier. Now, if I went to implement something to do with this that was not simple, I would turn to an OO type of design to organize the structure of my solution. This is the crucial difference.

    What you describe about teaching is perfect for a SE directed program. In fact, I'd argue with you that it's what it should be.

    Woz
  • There is an old joke about C being portable assembly. This is what I was referring to.

    Woz
  • Instead, university CS programs should start at the beginning. Teach assembly language. Use assembly language for everything, at least for the first two years.

    As C is portable assembly, it counts for what you are asking for.

    Woz
  • True, but when everyone is having trouble, it's a good sign they're getting bad advice or not trying very hard. And of course, after they do poorly, they blame the prof, department, etc.

    Woz
  • Somebody has written a Win32 [tripod.com] MMIX emulator, complete with source code. If anybody's looking for a project, porting this to Linux/Unix is probably a good one . . .
  • There is nothing wrong with being a COBOL programmer. Having some experience with C or assembler is very useful. Do you really understand the mechanics of your language's subroutine linkage conventions? Can you read and understand the assembler output of the compiler? Can you debug a core dump? I think all CS students should take a course in assembly language programming. Digital logic design and basic electronics are also helpful.
  • I hear your same complaints from a friend who is a history professor at a major midwestern land-grant university. He says 90% of his students are "knuckle-walking droolers who can barely string four words together".

    And this is the problem with achievement of any kind everywhere. Most people are more interested in eating, fucking and getting wasted. Anything that involves work, thought or some kind of introspection is clearly out the window.

    I see the same thing at work, and saw the same thing at University and in high school. People just aren't interested.
  • Ug... We learned Pascal and Fortran77...

    Neither of those two got me anywhere. I expect things have changed, but it's still relative.

    To make your CS course worthwhile and strictly usable in your future career you should actively seek sponsorship in a software company and take a split year working for that company during your course.

    CS courses could be better by ensuring that the University administration actively assists students to find sponsorship and provide options for taking a year out during the course.

    Just my petty thoughts...
    Hmmmm... yes, it would be very bad if one company sponsored an entire faculty, but this wasn't what I was trying to say.

    I think work experience is very important. I'd like to see Microsoft, Sun, RedHat, and whatever sponsoring say 5/10/15 places each.

    Students will then learn not just the theory, but its application.

    University, for me, was a real bore, and didn't prepare me for the real tech world at all. I just don't want to see others wasting their time on stupid things that they keep in the courses. Putting the students out there for a year to actually work accomplishes this - and hey, we can't all afford the tuition fees, let alone the living expenses, these days; taking a year off to earn some hard cash and getting sponsorship from a corporate entity is probably required, especially in the UK, possibly more so in the US.
  • Should the scientist be able to build up a prototype circuit or something? Sure. But Engineering isn't about just that; it's about procedure and foolproof methods. Your scientist doesn't build the production line; an engineer does. A scientist can tell you what kind of load-bearing materials and shape to make a bridge, but engineers actually build it... because they deal with the actual physical realities.

    Of course there is overlap. But just because a technician with no degree can solder as well as a highly degreed engineer does not make him an engineer.
    One thing I've noticed is that at many universities, CS actually stands for 'Computing Science', so it's the science of computing, not of computers.
    Finding a faster way to factor large numbers is CS. Finding new methods of sorting megateragigabit databases is CS. Building a faster computer is engineering.
  • Physics IS a branch of applied math.
    And he's right. Traditionally, the science of computing is all about math (and still is).

    The problem nowadays is many people are assuming that CS is about the Internet, programming, and computer hardware. It's not.
  • "Computer Science* is the study of data structures and algorithims, which is *DEFINATLEY* a science

    No, it is not. Science is the study of natural phenomena, and the use of the scientific method to develop predictive models that describe the behaviour of those phenomena. The study of algorithms is a branch of mathematics.

    And as far as having to understand programming before understanding algorithms, well, I feel sorry for your professors. Algorithms were invented 1000 years before computers. The word algorithm is a corruption of the name of the author of the first book on modern algebra, written in the year 830 by Mohammed ibn-Musa al-Khowarizmi.


    MOVE 'ZIG'.
  • According to this definition, physics, biology, math, and computer science are all sciences.

    1. Please explain to me what "observation, identification, description and experimental investigation" has to do with computer science?

    2. Natural phenomena. 'Nuff said.

    3. Such activities applied to an object of study.

    Bzzzt. None of these apply to the activities described as computer science.

    Mathematics is an extremely important part of the application of the scientific method, however that does NOT make it a science per se. It is merely one of the tools (the most important) used in the pursuit of science.

    On a deeper philosophical level there have been attempts to classify mathematics as a science by asserting by induction that the structures of mathematics (sets, numbers etc.) are real objects through holistic continuity with the real objects of science. However, this approach fails when you investigate it further, for the application of mathematics to science is always in an idealized sense. For example, in the study of the fluid mechanics of waves you do not treat the individual atoms of water in a mathematical fashion; you assume that you have a continuum of fluid, and you do not take into account the depth of the liquid, merely assuming it is infinitely deep. Therefore there is no essential holistic continuity between the objects of the study and the mathematical structures used to build models of their behaviour. These objects are fundamental abstractions.

    Q.E.D.


    MOVE 'ZIG'.
    What is your definition of 'CS'?

    The first thing is to change that name. Computer Science is an oxymoron. The only science involved with computers is in the making of them. Solid state physics etc.

    CS as it is today should be taught as a branch of applied mathematics, which it really is.


    MOVE 'ZIG'.
  • Because if they're round, no matter how the manhole cover falls onto the opening, it won't fall through and hurt the person standing underneath.

    While this is the conventional answer, it is an incorrect answer. It is quite possible to make a manhole cover of ANY shape that will not fall into a manhole if the manhole diameter is smaller than the smallest axis of the cover.

    The reason that manhole covers are round is because this shape gives them the maximum minor axis size for a given amount of material, i.e. it is the optimal shape from an economic point of view.


    MOVE 'ZIG'.
  • Mathematics is not a branch of science

    My premise.

    Some would say that math IS physics.

    They would be wrong. The study of physics implies the study of some natural phenomena. The study of mathematics does not. Physics does use mathematics intensively in such study, so much so that advances in math are often required to solve physics problems. But it is not true that advances in math always lead to advances in Physics, or that advances in Physics are always built on advances in math.

    Science is not the explanation of natural phenomena. It is the pursuit of knowledge.

    Far too encompassing. Memorizing a poem is not science but is a pursuit of knowledge.

    What's over the horizon, why is the sky blue.

    the question 'why is the sky blue' is the study of natural phenomena. What is over the horizon may involve sailing around in a boat. Nobody ever called Magellan a scientist.

    You are confusing science with philosophy. They are NOT the same thing.


    MOVE 'ZIG'.
    I've been at three different computer science schools (Waterloo [uwaterloo.ca], UWO [csd.uwo.ca], and OGI [ogi.edu]) as an undergrad, grad student, and professor. Some of these schools are great, and some not so great (no comment :-) The teaching quality does vary, but not that much. I've concluded that the real difference is the quality of the students, which induces a feedback loop.

    What happens is at a great school, you have a strong student body. This lets the faculty run the program at a high level (teach fast, advanced content, etc.). This attracts even stronger students, forming a positive feedback loop.

    At a not so great school, the students are relatively weak. This forces the faculty to teach slowly, remedial content, etc. Students may also be looking for that "quick fix career change", which means teaching technology (Java, JDBC, VB) instead of fundamental concepts (algorithms, data structures, abstraction). This in turn attracts more of the weaker students, forming a negative feedback loop.

    So if you're hot stuff, go to a hot school. When the assignments are hard, don't be surprised. If you're more into a slack lifestyle, go to a lesser school.

    Of course, teaching quality does vary. But contrary to what some other posters have said, teaching quality is not the inverse of research quality. Some research-oriented faculty are too busy to spend time on their students, while others are also truly great teachers. At small colleges, some faculty are there because they truly love to teach and are great at it, and some are there because they are lamers and a Moo U appointment is the best faculty job they could get. But my basic observation is that these variations are minor compared to the student body feedback effect.

    ----
    Crispin Cowan, Ph.D
    Research Scientist, WireX Communications, Inc. [wirex.com]
    Immunix: [immunix.org] Security Hardened Linux Distribution
    ----
    Research Assistant Professor [ogi.edu] of Computer Science
    Oregon Graduate Institute [ogi.edu]

    The thing is, most universities follow a standard curriculum set out by the ACM [acm.org].

    There are a few versions of this; the most recently completed is Computing Curricula 1991 [acm.org]. There is another version in progress, Computing Curricula 2001 [computer.org]. I seem to remember being told that most schools are actually somewhere in between the 1991 curriculum and the previous recommendation, which doesn't seem to be online.

    So, the bottom line is that our beef may be with the ACM. This question is well-timed, since the new curriculum is still in development. There are several discussion groups [computer.org] open to the public on the new curriculum proposals.

    Greg

    Did you learn to drive a car by first studying how a combustion engine works?

    This is the kind of argument people seem to be making when they say that Scheme is not a good first language.

    But even if you think CS graduates must start by learning concepts like pointers and memory details, again I'd say Scheme is a good first choice. Just think of the Box-and-Pointer representation of cons-cells in Scheme. You have a very simple view of pointers as well as memory usage (by building or destroying lists). Why not start with a simpler view of pointers and machine memory before you move on to the wild land of C-style pointers and memory segmentation?

    Scheme is an OO language as far as I'm concerned - I know it's a functional language, but functional vs. procedural is a different axis than OO vs. non-OO (which are more styles in my mind). In what way do you not consider it OO? Scheme lends itself quite well to OO programming, much more so than C (which you can use some OO concepts in, but it does not lend itself to it).

    As for "Pseduo-code is how we describe algorithms". Why? Pseudo-code came from a sort of abstract representation of many langauges like C and Pascal, in order to create readable code that was free of language specific constructs just there for housekeeping (like memory management or strong typing). As languages advance do you not see room for describing algoritms in some other way? Must we always be trapped in psuedo-code, forcing this mind-numbing and ancient way of describing algorithms to students? It has its place but there are always ways to improve communication and thus understanding!

    Consider "Scheme and the art of programming". Though it speaks of sorting, queues, and other things we can (and do) demonstrate with pseudo-code, it does not once use pseudo-code. It would be near meaningless in scheme to do so - as I said, that is a bad mapping.

    What it does do is use the actual code for a queue in Scheme in order to talk about queues, as well as demonstrating Box-and-Pointer diagrams for the same.

    Similarly, when speaking of sorting, it uses actual code instead of pseudo-code and then shows examples of the code in action and various states of lists to demonstrate the efficiency and operation of various sorting algorithms.

    Part of the power I see in using Scheme as a teaching tool is that the language is truly simple enough to work with that you can make ideas readable enough to understand instead of having to distill them into pseudo-code. You can work with actual code as a base and then expand on that to learn with, much more powerful than forcing someone to convert cryptic pseudo-code into whatever language you are using at the moment and then claiming they understand what the pseudo-code really meant. All you have done is successfully created a ((pseudo-code)) to ((Language)) robot that does translation for you.

    As far as being a "Toy", Scheme is certainly not going to ever be used widley as a practical language. But I have found many of the concepts I learned working in scheme serve me quite well now working in Java, and before that in C++.

    I also wonder why we should let loose programmers into the world who may understand (partially) about memory management but have no idea why they might use a linked list as the basis for a queue rather than an array, or know what a hashtable is. I'd frankly much rather they understand fundamentally the cost of various common patterns than that they fully understand what is happening at the machine level. Understanding of mechanics is so much less important than understanding of the architecture of systems and sub-systems that I really fear what will happen if we keep on churning out people who understand how to build a bridge but not how to build one to last or be stable under load.
    I don't think the problem there is with teaching OO first - I personally think you should be taught both, and it's going to be difficult moving from a procedural language like C to an object oriented language like Scheme or Java, no matter which direction you go.

    I think you pointed out the reason yourself why things were not flowing well. Why on earth would you use pseudo-code AT ALL in an OO course? Why wouldn't you use UML diagrams (or something like them) to illustrate sets of objects and how they'd work together, or better yet, patterns?

    You can't just switch a class to a different language; you have to have the class change to reflect the way you would naturally work in the language to be used. You can bet that in the Scheme classes I took, the way to approach problems looked a lot different than in my C++ classes!

    I also agree that Java is not the best language to start with though - I really still think the best route is to learn Scheme and then work outward from there. Scheme really lets you worry mostly about the problem, and not so much about the IDE. I also think it teaches useful ways to approach problems. The interactive nature of it also is easier to work with while you're learning.

    Incidentally, your argument points out why I think C++ is really the worst language of all to start with - not only do you have C issues to deal with, but also OO concepts as well. That is too much for any beginner to deal with... I think the path from an OO language like Scheme to C++ might be manageable to start to learn the details about what the language is really doing.

    The language taught should not matter at all. What is important is the concepts presented in a number of different languages and applying those concepts to other languages. If too much emphasis is placed on what is cool and high tech, then students suffer because they don't understand concepts when moving on to new or older languages. Cool languages are like the soup du jour. They're only really around for a day or so. After graduation, do you really think you'll be given a course on the new language the boss wants to program in? No, he or she will just say go program in it, and it will be up to you to learn it.

    20 years ago the cool languages were Fortran and Cobol. 10 years ago it was C and Pascal, now it's C++ and Java. Who knows what will be cool 10 years from now. The important thing is to learn and understand how the machine works and how data, instructions, and devices are handled. Specific language skills are generally worthless in a few years, because you'll either forget them from lack of use, or the language will be obsolete for new stuff and legacy for everything else.

    That being said, teaching in C++ to begin with introduces a number of ideas including object oriented programming, some structural programming, and things that Java doesn't teach such as pointers and memory management. My university, KU [ukans.edu], is moving away from Java and back to C++, because Java hides so many details the new students don't understand how things work.

    Like you stated, learning on your own you gained more knowledge. I agree, but the vast majority of students are only in the classes to get a grade and not to learn. All that matters is the A, and not C++.
  • I think that this has to depend on the attitude of the students involved in the program, and how much they need mentoring in their studies.

    I went to one of those heavily research-oriented schools, and while lectures were quite large (class sizes of like 150-200 people for upper division CS classes), a lot of people still learned a lot. Part of it comes down to your personal style. Are you willing to seek out the professors and ask them questions and talk to them during office hours? While a class of 200 people means that of necessity the professor did NOT go out of his way to get to know any individual person, if you made an effort to speak to them, they definitely remembered you and would have conversations about the work. So in that respect, it's one of those things where if you decided to make the extra effort to get to know your professor, you'd get some of that same one-on-one interaction.

    It's MUCH easier to hide in the crowd at a big, research-oriented university. It's much easier to never make the extra effort to get to know your classmates, or your professors. And for some people, that's great because they WANT to blend in. But at least at my school, those who made the extra effort were always rewarded for it.

    But there was one advantage which I couldn't have gotten at a smaller program which didn't have a research focus, which was the chance to do research itself. I did a year of research in one of the research groups as an undergraduate, and was a full participating member of the projects I worked on. Having that opportunity (as well as the opportunity to take graduate classes as an undergraduate) was huge for me, and probably was the most defining part of my undergraduate education. If I had been at a school with less of a research focus, it never would have been made available to me. Quite a few of my friends did the same thing. Again, you have to go out of your way to do it, but almost all the professors had extra research grant money to spend on undergraduate research (the NSF does lots of HUGE grants JUST for this).

    I'm not saying that this style is for everyone, but being at a research-oriented school doesn't preclude you from having individual one-on-one education, as long as you've got the sort of personality where you'll go out of your way to get that kind of attention.

  • Because if they're round, no matter how the manhole cover falls onto the opening, it won't fall through and hurt the person standing underneath. If there's any points, that implies a cross section bigger than another cross section, so the cover could fall through and hurt/kill somebody underneath if it came down wrong.
  • The more things change...

    Back in '83-'84, this debate was going on at UC Santa Cruz... The students formed "CISSA" (the CIS Student's Assn.) to influence the department. CISSA wanted more applications oriented teaching. The department wanted theory.

    The department was correct, of course, but I think we actually struck a happy medium... We got a few application oriented electives, but the core remained theoretical.

    The big classes were Algorithms/Data Structures, Language series (Parsing/Semantics/Compiler Design), OS Theory (not a specific OS, either!), and Digital Logic. A good base, for the upper division theoretical courses and for Real LifeTM.
  • I am in 4th semester of a Comp Technology course in Ontario. It teaches 1/2 network admin/ 1/2 programming, with a little web design thrown into the mix.

    From my experience, the best classes I've had have been with teachers that know what they are talking about, have decent teaching skills, and WANT to be there! Teachers don't have to be great at being professors, but a good attitude and making the classroom a fun, collaborative place beat a "sit straight and listen to me drone for 2 hrs" approach. The best classes I've had are ones where the class contributes to an example program, after the theory has been presented, as a practical example.

    Don't base all your marks on tests! IMHO grades should be more about skill, and less about fact regurgitation. I can code circles around most of my classmates, but I don't do as well on tests because I forget the minute details sometimes (is it focus() or setFocus()? etc.). But since tests/quizzes are often 80% of the final grade, my marks suffer.

    Access to hardware is vitally important. I am in a laptop program, where every student is required to have a laptop, and the constant access to my development platform is a godsend. Nothing like being able to code up a project on a bus, at home, on 'family trips' etc. This makes learning programming a lot easier. Conversely, the networking && comp repair labs are only accessible 2 hrs a week, and the difference in practical working knowledge among my classmates between the programming and networking sides of the course is remarkable, with programming coming out on top.

    That's my long-winded 0.02c

    Dave

    DOS is dead, and no one cares...
  • CS programs suffer from the problem alluded to by other posts that they are teaching two things: computer science theory and computer science practice. Sometimes the best way to learn about a concept (such as object-oriented inheritance or parallel algorithms) is to learn a language. Programming is a tool to better understand the theory, not the goal of the curriculum.

    Should graduates be able to program? That's not clear (to me), but they certainly should be able to design computer algorithms to solve problems. Once you can do that, actually writing the code is generally not difficult. The next difficult step is, of course, testing and debugging the code written. However, the theory involved there is very limited, as it's more engineering than computer science. Style (readability and maintainability) is similar in flavor. I'm much more concerned that people graduate without knowing Dijkstra's algorithm or being able to do dynamic programming than about whether they can actually program.
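    (For reference, a minimal sketch of Dijkstra's algorithm -- adjacency list plus a priority queue, illustrative only, not tuned:)

        #include <cstdio>
        #include <functional>
        #include <limits>
        #include <queue>
        #include <utility>
        #include <vector>

        // Shortest distances from a source in a graph with non-negative edge weights.
        std::vector<int> dijkstra(const std::vector<std::vector<std::pair<int,int>>>& adj, int src) {
            const int INF = std::numeric_limits<int>::max();
            std::vector<int> dist(adj.size(), INF);
            using Item = std::pair<int,int>;                        // (distance, vertex)
            std::priority_queue<Item, std::vector<Item>, std::greater<Item>> pq;
            dist[src] = 0;
            pq.push({0, src});
            while (!pq.empty()) {
                auto [d, u] = pq.top(); pq.pop();
                if (d > dist[u]) continue;                          // stale queue entry
                for (auto [v, w] : adj[u])
                    if (dist[u] + w < dist[v]) {
                        dist[v] = dist[u] + w;
                        pq.push({dist[v], v});
                    }
            }
            return dist;
        }

        int main() {
            // 4 vertices, directed weighted edges stored as (to, weight).
            std::vector<std::vector<std::pair<int,int>>> adj{
                {{1, 1}, {2, 4}}, {{2, 2}, {3, 6}}, {{3, 3}}, {}};
            for (int d : dijkstra(adj, 0)) std::printf("%d ", d);   // prints: 0 1 3 6
            std::printf("\n");
        }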

    If we want people who can program, we need to introduce a course of study to teach them to program. Please don't call it computer science.

    I really don't know if it makes that much of a difference whether it's a BMath or a BS in Computer Science. That's why I put the FWIW (for what it's worth), because I think of it more as a subnote than any real point. For me, it was helpful, because I ended up taking a double in CS and Operations Research and the overlap in curriculum was rather high (I only needed 1 extra course), and I know a lot of others that did minors in Applied or Pure Mathematics.

    One good thing about the required Math curriculum is that first year math classes covered standard proof methodologies (contradiction, induction, etc.). These were required in later CS classes. They have dropped Calc 3 from the CS requirement and I can see why (it was basically Calc 2 generalized to n dimensions). The core math that is left (1 classical algebra, where you learn RSA, 2 linear algebra, 2 calc, 1 numerical statistics and one observational statistics) is rather handy to have in upper year CS, although I don't see the need for a whole Faculty just to support a CS Major.

    I guess, in short, if a University wants to offer Applied Math, Combinatorics & Optimization (OR was a specialization of this), Pure Math, and Statistics as Majors, then it makes sense to open a faculty of mathematics. It even makes sense to me to put Computer Science in that Faculty and share core curriculum with the other Majors. But if you are just offering CS, you can achieve the same effect by adding these classes to a Science or Engineering Faculty (which I have little doubt they would already be part of, with the possible exception of Classical Algebra).

  • This really shouldn't matter in the long run. If you are being taught procedural and OO concepts, then you will be able to generalize them to any language. My univ use Pascal in first year for procedural stuff and Modula-3 in third year for OO. Everything I learned in Modula-3 generalized very easily to Java.

    In fact, everything I did in Pascal generalized to Java as well. We were doing ADTs and proving the order of algorithms. This is the stuff I think first year students should be taught because all of that sticks with you for the rest of your schooling. It's hard to take a course on algorithms when you don't understand the reasons why a binary tree can be slower than an AVL tree.
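    (A small illustration of why balance matters: a naive, unbalanced BST fed already-sorted keys -- its worst case -- degenerates into a linked list, which is exactly what AVL or red-black trees are designed to prevent. Sketch only; it leaks memory and ignores duplicates.)

        #include <cstdio>

        struct Node { int key; Node* left; Node* right; };

        // Plain BST insert with no rebalancing.
        Node* insert(Node* root, int key) {
            if (!root) return new Node{key, nullptr, nullptr};
            if (key < root->key) root->left  = insert(root->left, key);
            else                 root->right = insert(root->right, key);
            return root;
        }

        int depth(const Node* n) {
            if (!n) return 0;
            int l = depth(n->left), r = depth(n->right);
            return 1 + (l > r ? l : r);
        }

        int main() {
            Node* root = nullptr;
            for (int k = 1; k <= 1000; ++k)             // sorted input: worst case
                root = insert(root, k);
            std::printf("depth = %d\n", depth(root));   // 1000, not ~10 as a balanced tree gives
        }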

  • Just because they don't come to discuss the material doesn't mean they aren't discussing it. I never once went to a TA, because I had my own group of friends to discuss the class concepts with.
  • The analogy still holds. You can do all of the order analysis of a program you want, but ultimately it's the real execution of the program that proves you right or wrong. Physics predicts correctly things that cannot be observed with the technology at the time. Computer Science is no different.
    I see a lot of people saying that we should start with assembly. I personally feel this may not be the best solution. Starting with a higher-level (although not a 4GL) language will help students see patterns of language use. Then when they are taught assembly, they can be taught how these patterns are implemented (like how a function call is made in asm). If you start with asm, then you have to teach procedural concepts that are better taught in a procedural language.

    But since I didn't take the asm first route I'll never know. I learned asm after learning BASIC and C and while learning C++.

  • Knowing C is not the same as knowing assembly. Before I knew assembly, I had little idea as to how the stack was used in a function call, but I constantly used function calls in C.
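    (A rough picture of what the parent means, hedged because the details vary by ABI, calling convention, and optimization level; the comments describe the classic unoptimized 32-bit x86 cdecl case.)

        #include <cstdio>

        // Conceptually, a call to add(2, 3) does roughly this:
        //   1. the caller pushes the arguments onto the stack (right to left)
        //   2. 'call' pushes the return address and jumps to add
        //   3. add saves the caller's frame pointer and reserves space for 'sum'
        //   4. add leaves its result in a register and 'ret' pops the return address
        //   5. the caller removes the arguments and continues
        int add(int a, int b) {
            int sum = a + b;   // local variable living in add's stack frame
            return sum;
        }

        int main() {
            int r = add(2, 3);
            std::printf("%d\n", r);   // 5
        }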
  • It is you who are mistaken about a great many things

    Computer Science is a science, it is *not* a natural science. Yes, there is a difference.

    Second of all, your statement about algorithms, while correct, completely missed the mark. You need a language to describe algorithms in, regardless of whether it's 2000 AD or 830 AD. In computer science this language is almost always C/C++ or pseudo code ...

    And sure, Mohammed ibn-Musa al-Khowarizmi might have come up with some algorithms nearly 1200 years ago, but guess what? Most modern algorithms are designed with rates of millions of operations per second in mind ... If you don't believe me, try decoding an mp3 by hand; thus the necessity of computers.

    *Computer Science* is the study of data structures and algorithms, which is *DEFINITELY* a science ... I always hated my algorithms courses, no programming, just calculus to prove how fast they were! A famous computer scientist [whose name escapes me] once said "About the only interesting thing you can do with programs is prove they are correct." A good CS curriculum will walk you thru things like:

    Data Structures, Algorithms, Automata, Language Theory, Performance Evaluation, and compiler construction, which are the meat of computer science ...

    However before you can understand any of these advanced topics you need a good base in a few programming languages -- which is where I expect some universities get caught up.

  • I'm about to graduate from a 4-year program at CMU, and then go on to grad school.

    Here's what I'd change:

    - Emphasize teaching and TAing; heavily encourage students to TA, and provide training and resources for them and their professors. Since the numbers don't really work if every student is TAing, organize some classes so that they are like teaching (presentations and such). I definitely learned more from TAing 6 (jeez) classes than I did taking them.

    - A mentorship/advisor program for senior year, similar to how it is typically done in graduate school.

    - De-emphasize low-level programming for beginners. In computer science theory we only worry about algorithmic efficiency, and in industry computers are getting so fast that programmer productivity and the quality of a product matters more than its runtime speed. Of course, systems programming classes are vital to a well-rounded CS program, just that you shouldn't make beginners have to deal with bus errors and core dumps.

    - Objects are not everything! I would have loved a course which gave us design problems and asked us to approach them in lots of different ways (objects being one of those, but not the only one).

    - "Tracks" for students who want to be industry programmers vs. theory heads vs. etc.
  • CS != Programming. Thus universities should not be teaching the "latest and greatest" programming languages (java, perl, PHP, etc.).

    Instead, university CS programs should start at the beginning. Teach assembly language. Use assembly language for everything, at least for the first two years.

    Why? Because in order to understand the tradeoffs involved in, say, using a hash table instead of a linked list, you have to implement them. If you have a high-level language which provides hash tables for you, you'll go away with a biased view of data structures, because you won't understand that the complexity of hash tables is usually undesirable. (A rough C sketch of that tradeoff follows at the end of this comment.)

    Only once students have a firm grounding in the basic concepts of data structures and algorithms should they be introduced to high level languages.
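
    To illustrate the tradeoff being described, here is a minimal sketch in C (only a sketch; the names and the bucket count are invented for illustration). The linked-list lookup is a handful of lines, while even a toy chained hash table needs that same list code plus a hash function, a bucket array, and (not shown) a resizing policy:

    #include <stdio.h>
    #include <string.h>

    struct node { const char *key; int value; struct node *next; };

    /* Linear search: simple, but O(n). */
    struct node *list_find(struct node *head, const char *key) {
        for (; head != NULL; head = head->next)
            if (strcmp(head->key, key) == 0)
                return head;
        return NULL;
    }

    #define NBUCKETS 101

    /* Toy hash function -- just enough to show the idea. */
    static unsigned hash_key(const char *s) {
        unsigned h = 0;
        while (*s)
            h = h * 31 + (unsigned char)*s++;
        return h % NBUCKETS;
    }

    /* Average O(1) lookup, but only after paying for the bucket array
       and the hash function above. */
    struct node *table_find(struct node *buckets[NBUCKETS], const char *key) {
        return list_find(buckets[hash_key(key)], key);
    }

    int main(void) {
        struct node a = { "answer", 42, NULL };
        struct node *buckets[NBUCKETS] = { 0 };
        buckets[hash_key("answer")] = &a;
        printf("%d %d\n", list_find(&a, "answer")->value,
                          table_find(buckets, "answer")->value);
        return 0;
    }

    Everything from NBUCKETS down exists only to make lookups fast; a beginner who gets hash tables for free never sees that cost.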
  • Hey, as long as your summers are your own, you always have the option of taking a summer internship. While here at Drexel they have an entire staff that will hold your hand through the entire process and a coop-only job board that attracts hundreds of local companies, you *can* DIY.

    The idea is surprisingly easy to pitch to companies, as it makes good business sense. Basically you say to whoever is in charge of Hiring, "Hey, here's my resume. Everything on here is something you can have me do for 1/3 the cost of having a salaried engineer do instead. I require little handholding, learn quickly, play well with others, etc. Plus once I graduate you'll have someone you already know and trust ready to come on full-time. Whaddya say?"

    Do this 10 or 15 times and you're sure to get a couple of bites. Be persistent! Even more so than in a normal job market, you are really becoming a salesman whose product is himself (and a very marketable idea).

    so good luck, and feel free to email me if you need any help.

    p.s. fundamentals first is the way to go. trade school is for weenies ;-)

  • I totally agree. It's just a matter of perspective. There are several other factors influencing the perspective as well as the ones you list:

    • Once you learn X, other people seem dumber when they talk about X. It is much easier to see other people's flawed thinking when you know more about the subject.
    • As a graduate student, the classes you took most recently had much stronger students in them than the ones you are teaching. This is true at any level. To a 9th grader, a 5th grader's attempt at solving an algebra problem will look naive and stupid. As a teacher, you must learn to step into other people's shoes. Remember that you did not obtain the knowledge you have overnight.
    • As a good student, you probably hung out with other good students. When you are forced to interact with the class en masse, you have to see both the good and the bad students.
    • Going with the parent poster's point. You probably never got too much info about how good other people's work was in classes you were taking. If you got too much detail, after all, it would be cheating. Without having this information, you probably assumed that you were of average ability. Of course, you were not.
    Cheers,
    Brandon.
  • Those who took the class with me are as a whole not afraid of pointers. They understand the difference between a memory location and the contents of that memory. When they get to it, they have little trouble adapting to higher level languages, because they understand how those languages work.

    Those who took it later are uncomfortable with pointers. They might not even know what one is-- even enough to explain pass by reference. They have a difficult time moving from C++ or Java to something lower-level, such as C.

    So here's my question: What is the relative size of these two groups? Are they roughly the same? Or is the second group (the one that generally has trouble with pointers) larger?

    Could it be the case that the difficulty of dealing with pointers and assembly language simply acted to filter out those who didn't have certain talents needed to deal with those things? That would suggest that the earlier curriculum didn't do a better job of teaching pointers, but really just drove people who could have been effective programmers (except when dealing with certain small and nowadays rarely-used aspects of programming) out of the field.

    For the record: I started with BASIC, and then Pascal in High School courses. After Pascal, I taught myself C and X86 assembly; I don't have any trouble dealing with pointers. But, in my day-to-day work, I almost never find myself needing that ability.

  • What, are you suggesting that pointers are a rarely used aspect of programming? Pshaw.

    Simply saying 'Pshaw' hardly qualifies as a compelling argument. Pointers are only really useful when you're doing hardcore systems programming - which is a small minority of the programming that gets done.

  • The classic answer is that it's not possible to drop a round manhole cover through the manhole ring. Square or rectangular manhole covers can fall through the matching hole.

    Actually, I suspect it has more to do with how manholes and covers were made around 1900. Around then, the available machine tools were drills, planers, and lathes. Metal objects were made by casting a rough shape, then finish-machining. You could make any rough shape, but finished surfaces either had to be round (drilled or turned) or flat outside surfaces (planed). Flat inside surfaces, like a square hole, were very tough to make. If you look at a turn-of-the-century steam locomotive, you'll see this clearly; all finished surfaces on large parts are either outside flats or rounds.

    Microsoft used to ask this question in employment interviews, which is why it's well-known.

  • If you can't do object oriented programming in plain, vanilla C, then you don't understand object oriented programming.

    Period.
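
    By way of illustration (only a sketch, with invented names), this is the usual way OO ideas map onto plain C: a struct for the data, a struct of function pointers for the methods, and embedding for inheritance.

    #include <stdio.h>

    /* A "class": data plus a table of function pointers. */
    struct shape;
    struct shape_ops {
        double (*area)(const struct shape *self);
        const char *(*name)(const struct shape *self);
    };
    struct shape {
        const struct shape_ops *ops;   /* the "vtable" */
    };

    struct circle {
        struct shape base;             /* "inheritance" by embedding */
        double radius;
    };

    static double circle_area(const struct shape *self) {
        const struct circle *c = (const struct circle *)self;
        return 3.141592653589793 * c->radius * c->radius;
    }
    static const char *circle_name(const struct shape *self) {
        (void)self;
        return "circle";
    }
    static const struct shape_ops circle_ops = { circle_area, circle_name };

    int main(void) {
        struct circle c = { { &circle_ops }, 2.0 };
        struct shape *s = &c.base;                              /* "polymorphic" pointer */
        printf("%s: %f\n", s->ops->name(s), s->ops->area(s));   /* dynamic dispatch */
        return 0;
    }

    This is roughly the machinery a C++ compiler generates for you; doing it by hand once makes the magic much less magical.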
  • So what you're saying is that coders should write everything from scratch? Have you written your own X server, or graphics toolkit? I doubt it. Java programming is a hell of a lot more than putting together a GUI. Or does the JDK contain classes for every business process for, say, every public utility on the planet, including future ones not yet thought of? OOP requires as much thought and knowledge as procedural, unless you think you could learn Java and become productive in a few days.
    By the way, Python is OO, so it's a crap example, and it also has Jython (**gasp**), a Java-enabled version.
  • So what you're saying is that only people who can code in C or C++ are programmers. Well I have the misfortune to be a COBOL contractor and, guess what? I get paid $100,000 a year. Not bad for someone who doesn't know the 2 true languages, C and C++. Of course I also know a few other languages, such as Java, Rexx and a bit of Python, but they're not 3l33t and therefore irrelevant. I've got a question for you Mister überprogrammer: what's the point of having high-level languages if you have to do the memory and pointer management yourself? Why not just use assembler? Granted it makes device driver writing easier, but that's a pretty specialised skill compared to the requirements of your average bank or supermarket.
  • Of course I can debug a core dump. It'd be really tedious to have to go through the program putting displays everywhere. I don't need an understanding of assembler to do this, just knowledge of where my storage is.
  • I don't much like COBOL. It's way too wordy. It's my job though and it pays for me to play with better stuff (and the rest of my life too). I agree with you that a C/C++ programmer would have an easier job learning COBOL than I would (although they'd probably go insane first), but it doesn't make me any less of a programmer. The reason I haven't bothered to learn C yet is that it's a bit obscure and requires me to allocate my own memory and mess around with pointers which, although more efficient, is a pain in the backside. High level languages for me are supposed to allow the programmer to write code without having to worry about allocating and freeing space and cleaning up after myself. The system I'm working on at the moment is a shambles and would be a lot worse in C/C++ due to having to do more low-level stuff.
    Finally, I'd like to apologise if I was a bit ranty, but you seemed to be saying that C/C++ programmers are the only ones who truly know how to program. And salary is probably the worst indicator of someone's ability. I've worked with people on way more than me who should have been wearing bright orange wigs, baggy trousers and big floppy shoes :)
  • The correct title of the series is The Art of Computer Programming, although Volume I is named Fundamental Algorithms. You can read the details about it, and other forthcoming volumes, on this part [stanford.edu] of Knuth's home page.

    As for MIX, the new editions will continue to use something similar to it, an assembly language for a machine called MMIX. Knuth explains [stanford.edu] why he continues to use a low-level language on his home page as well. There are several reasons, and I can't do justice to them by trying to summarize them here.

    Funny side note: MMIX will have an operating system, NNIX. But the system is open, so

    Other alternatives are also possible; for example, somebody at the Free Software Foundation might decide to come up with an alternative system called GNNIX.

    Knuth's way with bad puns is one of his endearing qualities....



    --
  • There's some key stuff that, if you have not learned--and I mean learned, been steeped in, experienced, breathed--then no matter what you have learned, you don't know shit. Get the following things added to your CS curriculum some way or other:

    • exception handling or signalling. Yeah, yeah, I know, you learn this. But don't learn it as what you do for an "error" or a rare exceptional condition. Learn it as what you do for any case where you really wish to return a value of a different type, or simply to break out of a loop. For example, any "search" method should return a reference to anything it finds, but should "signal" not found. Then you can "search" and process the answer without a pesky "if". It's subtle. It changes the way you work, but what it really does is teach you to strictly obey type correctness. You'll write more code that can be reused because when you plug it in somewhere, it brings its own context with it.
    • tail recursion. No, not the compiler optimization that makes it possible. Instead, the application programmer's semantic that's been called "the ultimate goto". It will erase the distinction in your mind between blocks and functions and make you realize that languages like C++ put some real shackles on the things you'd like to do. (See the short sketch after this list.)
    • closures. When you enter a sub-function, the variables get pushed onto a stack frame, right? Closures offer the ability to hang onto a stack frame explicitly. The way you can then preserve state achieves a break with the way you used to program that's very similar to the OOP paradigm, except it's runtime "data" inheritance rather than compile time "type" inheritance.

    Sorry, there are more, but I've got to head out the door. These things have been part of the MIT freshman CS curriculum for upwards of 25 years, and they'll be lessons you never forget.
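
    Since the parent had to run, here is a tiny illustration of the tail-recursion point, sketched in C with made-up names. Scheme is the natural home for this, and note that the C standard does not guarantee tail-call elimination, though common compilers perform it with optimization turned on; the point is the shape of the code, where the recursive call is the last thing the function does.

    #include <stdio.h>

    /* Sum of 1..n, written as an ordinary loop. */
    long sum_loop(long n) {
        long acc = 0;
        while (n > 0) {
            acc += n;
            n -= 1;
        }
        return acc;
    }

    /* The same computation as a tail call: nothing is left pending on the
       stack after the recursive call, so a language that guarantees
       tail-call elimination runs this in constant space. */
    long sum_tail(long n, long acc) {
        if (n == 0)
            return acc;
        return sum_tail(n - 1, acc + n);
    }

    int main(void) {
        printf("%ld %ld\n", sum_loop(1000), sum_tail(1000, 0));
        return 0;
    }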

  • Any suggestions on good OO basics and theory books?

    You could try Craig Larman's "Applying UML and Patterns." I'll be using it to teach a third-year O-O development course, and the book has gotten generally good feedback from people.

  • Disclaimer: I'm a grad student and teach at UVic.

    the nature of C/Pascal being close to assembler helps people to fundamentally understand what the machine is doing

    Why do you think it's a good idea to start with teaching fundamental machine mechanics? While I agree that, to be a computer scientist, you do eventually have to learn the nitty-gritty details under the abstractions, I really think that's a bad place to start. The students will just get confused with the details and drop out before getting any clue as to how they might fit together. Understanding low-level computer functionality is intrinsically difficult; that's why we've come up with more abstract layers to make computers more approachable.

    from our experience at U. of Victoria teaching java as a first language does harm because memory allocation, pointers, etc. are not taught. Even though most languages in heavy use today - java, perl, python, etc. all have garbage-collection, these concepts are KEY, and should be taught in 1st year.

    Memory allocation is taught in Java. Or at least, I've tried to teach it -- but most students never understood that declaring a reference variable is a completely separate thing from allocating the memory for an object. I spent a fair amount of time on this, and still saw code like:
    Thing t = new Thing();
    t = someOtherExistingThing;
    (The first Thing is allocated and then immediately orphaned: the second line only rebinds the reference, so the new object just becomes garbage.)

    As for garbage collection, unless you're doing OS-level work, you might as well assume it's there. No high level language today would dare leave it out.

    B.Sc. Computer Science: algorithms, language design, computability, structures of data, efficiency. Software Engineering: handling obscene amounts of code and making it manageable. Programming Diploma: VB.

    Part of the problem (at least at UVic), is that our first year courses are "targeted" at all of the above people. This leads to a number of compromises that nobody is really happy with, but the money and teachers aren't there to split up the courses into separate streams. So while CS majors might benefit from an introduction to pointers and memory management in first year, the other two streams couldn't care less...

    There are also, of course, the classical problems of people coming into the program with vastly varying amounts of experience, but these have been discussed to death before. So while I'm not entirely happy with UVic's CS and SENG programs, I don't think teaching low-level concepts in first year is the answer.

  • The following is based on a recent conversation with a friend (yes I actually have one of those)

    There are advantages to knowing the older-style linear programming, compared to the more modern object-oriented style. Just to be perfectly clear in advance: I think people should know both.

    Part of the magic of object oriented stuff is that you do not have to get into the lower levels to tweak with things. But, lack of expertise in this regard leads to bloat, lack of efficiency, and another programming layer where flaws can crop up.

    An extreme example of the opposite practice is the products at Gibson Research Corp. [grc.com]. [Disclaimer: I am not an employee, and have never been associated with him or his company, aside from just being damn happy with his products.] Steve Gibson programs all of his stuff in assembly language. They are damn tight, damn fast, and damn small. If Windows were programmed this way, it probably wouldn't be quite so horrid.

    Point being, each programming approach brings with it certain strengths, and certain weaknesses. Being able to undercut the weaknesses with bits of magic from a lower level elements makes life easier.

    For example, the vast majority of books on things like JavaScript all cover the basics, but almost every one leaves out one or two critical elements needed to get the big picture. Never mind the various differences you hit (when you want to do something fancy) and still want a single, integrated page that serves all browsers. Never mind that every time you turn around you have another point version of the language to catch up on, and another round of sorting out the various levels of compatibility (even though people are trying to push standards).

    This last bit becomes the bouncing ball of marketing. Use this language, use that, with this additional layer of muck on it. But is this even an improvement? And do we want to be teaching this in college, etc., without teaching the underlying fundamentals really well?

  • The correct title of the series is The Art of Computer Programming, although Volume I is named Fundamental Algorithms.

    Yup. Serves me right for relying on my memory for something I read over 20 years ago! :/

    Thanks, too, for the links! Especially for his explanation of why he chose machine language then, and now.

    The single biggest impediment to following his examples was that I could not try them out directly. I learn best by DOING. Trying to follow what was provided is one thing. But I seem to learn far more when I can readily experiment and SEE what happens when I try changing different things. I sure hope someone DOES implement an MMIX / NNIX system. Heck, just an emulator that ran on Linux would be a big step in the right direction!

  • Camp Fear [camp-fear.fr.fm]

    Okay, it's a tactics guide to Counter-Strike, but it's a pretty good one. It discusses strategy in Counter-Strike (yes, there is strategy in Counter-Strike, because CS isn't deathmatch, just like GNU isn't UNIX and LAME isn't an MP3 encoder). Tactics for individual maps, as well as general strategies for each type of map, are discussed in depth. I'd recommend going there if you play CS, or want to castigate the lamers on your CS server.

  • In reference to your position that Calc is not needed, where are you coming from?????
    W/o calc, discrete math, linear algebra, and basic algebra, how do you propose to write code with any sort of tolerable time complexity? These higher math courses give you a HUGE set of POWERFUL tools to talk about and manipulate elements in your field.
    Already in my undergraduate career I have found uses for vector analysis and calculus without which I would have had some ugly code doing needless calculations on the fly! Specifically, calculus helps you reduce complex computations into smaller, easier forms!
    I can understand the position of some that the math requirements are annoying, but they are VERY worth it if we propose to educate Computer Scientists and not a bunch of very savvy computer users.

    Sam
  • I have to agree. Object oriented programming just doesn't make a whole lot of sense unless you understand what's going on behind it. Teaching people about abstract data types in C is simple. Showing people how you can implement them using data structures and function calls is simple. Making the final step and showing them that an object oriented language is nothing more than a prettier way to do this is just the icing on the cake.

    The problem is, when people learn about objects from the start, some of them actually tend to think too literally. Oftentimes, they will believe that each object is like an independent little program, not that it's just a data structure with some functions for accessing it. These are things you have to learn about from the ground up, without risking the misunderstandings you'll run into going the other direction.

    I must also point out the disadvantage of using Java as a first language, particularly because issues like Memory Management are so completely buried. That's great if you believe your students are never going to have to use a lower-level language in their careers, but when and if they do, they're forced to learn concepts they should have known far earlier. Concepts upon which a much deeper understanding of computer architecture rests.

  • I attended a small business focused school near Boston about 10 years ago and took a number of their CIS classes (algorithms, data structures, structured analysis, etc).

    I feel that one of the best tactics used by the professor was to create a project assignment that lasted the entire semester. We would start out by writing a basic application framework (say, an online catalog) and extend or modify its modules (replacing the data storage mechanism, for instance). Students who didn't abstract their code enough by implementing flexible interfaces would end up spending inordinate amounts of time rewriting their applications to accommodate the requirements of each new project unit.

    The project forced us to live with a body of code for an extended period of time and made us realize that shortcuts or sloppy coding practices would eventually hurt us. We were all forced to think about our code as an evolving work that we would be reusing and extending. For those who embraced the idea (and in doing so succeeded) it was great training for becoming professional programmers.

    Now, after 10 years in the software industry, I've become a v.p. of software development and spend a lot of time reviewing candidates. I always ask the recent college graduates about the structure of their programming classes. The vast majority have only experienced one-off assignments, where they write code, submit it for grading and never deal with it again. In many cases, it shows in the quality of their work. It takes a lot of time to retrain these folks so that they write code that not only solves the problem at hand, but is open and clean enough that it can adapt (or be adapted) to the ever-changing requirements of our customers.

  • When really they should be IT majors because they don't know the first thing about coding. They are interested in networking and web page making

    Maybe you should take a web page class or two. Learn some tags.

    The second problem is the fact that they require no previous coding experience to join the CS program. The art programs at my school require you to show them a portfolio. CS does not. So the CS classes are half coders like me and half people who never coded before.

    Do they require med students to provide a portfolio of people they did surgery on? Do they require business majors to show a portfolio of companies they have managed? Do they require computer engineers to show all the hardware they have designed? NO. Why? Because college is where you LEARN to do this. Art is different. Art is a talent that you have or don't (I know some will argue). CS is a skill that is LEARNED and DEVELOPED over time.

    So the CS classes are half coders like me and half people who never coded before. The people like me sleep and skip class and get A's. The rest go to class and study hard and get F's

    Well, aren't you so smart. I never coded before I got to U of I (probably one of the most competitive engineering colleges out there) and I get A's.

    The last thing I would like to do to improve CS class is make it all lab. Currently I have three 1-hour lectures and one 2-hour lab every week. I want to have three 2-hour labs a week. I learn nothing from someone teaching me how to code. I don't learn the week's lessons (if we learn something I don't already know) until I get to lab

    I don't really know what they teach you in lecture, but at U of I concepts and theory are taught in lecture. I don't think you can learn that stuff in lab. The way an algorithm or data structure works looks quite different in code than it does on the blackboard. I personally cannot see how coding a hash table will make someone suddenly understand how it works inside and out without seeing it on paper first (maybe that's just me). And I have no idea how to teach theory in lab. Go write a program that does the greedy algorithm. If you don't understand how it works, how can you code it?

    -Mark
    --

  • by Nagash ( 6945 ) on Saturday March 17, 2001 @07:47AM (#358142)
    You are incredibly correct and took the words right out of my mouth. I am about to go into the Masters program at UWO and during my four years of undergrad, I've seen the exact situation you describe.

    I am currently a TA for the 3rd year OS courses. Nobody, and I mean nobody comes to discuss the material until
    • There is an assignment/exam coming up in the next 3 or less days
    • They need to complain about their mark

    This sort of thing only increases the apathy felt by both the students and profs towards each other. The reason for this can mostly be attributed to the mass influx of people taking CS over the last few years. Whenever a huge amount of people start taking something that is "in demand" in the world, you get many people who just want to coast through and get that piece of paper. They have no interest in learning - they just want the degree to go make some money.

    Woz
  • by wunderhorn1 ( 114559 ) on Saturday March 17, 2001 @07:16AM (#358143)
    Here at Drexel [drexel.edu] (and I think RIT, too) we have a cooperative education program, or co-op. Basically, the students start out their freshman year with three quarters of classes (fall, winter, spring) and take the summer off; then, starting sophomore year, they work at a real job for two quarters (six months) and go to school for two quarters, repeating that through their sophomore, pre-junior, and junior years. Then for their senior year they have three terms of classes again.

    I find this is a great way to learn. There's so much more to CS than what comes from a book or a tenured academic. It can also expose you to fields you never thought about before; for example, I took a co-op that ended up with me writing code [dval.com] for the huge servers that control your cable TV.

    Being at a school where everyone is supposed to be doing the co-op program makes it easy, because the classes are scheduled around different majors' co-op cycles and you don't have to worry about your course sequences getting messed up because you were out working when the one class you needed was being offered.
    But since most universities haven't caught on to this, I would suggest going out and trying to get a summer internship.
    There's simply no substitute for real life experience.

    Now, as for the academic side, I would give students the opportunity to work at their own pace whenever possible. Let them test out of classes if they're learning fine on their own. My school starts students out with a term of HTML, JavaScript, and "computing fundamentals", which was a complete waste of time for me. But they forced EVERYONE to take it. Luckily the next term was intro to C++ and you had the option of placing out of it and into Data Structures. Yummy stacks, tasty queues, delicious doubly-linked lists!

  • by Animats ( 122034 ) on Saturday March 17, 2001 @07:38AM (#358144) Homepage
    I went through Stanford CS for a Masters in the mid-80s. This was back when the expert systems crowd was talking like they were going to rule the world (even though they knew better) and CS was still under Arts and Sciences.

    It was all theory. I once embarrassed the faculty by pointing out publicly that it was possible to graduate with an AI specialization without ever seeing an AI program run, let alone writing one. Programming instruction was basically "here's the manual". I took a comparative programming languages course, which covered everything from COBOL to Prolog. It didn't actually involve writing and running programs. I got a good grounding in formal methods, though.

    I did most of my programming on a Xerox Alto, chosen because few wanted to learn Mesa and use the obsolete things, so they were always available. (The Alto is the original GUI machine from Xerox PARC. Page-sized monochrome screen, mouse, Ethernet, file servers, laser printers, all built with 1970s technology.) Most students used the time-shared DEC-20 machines, accessed through dumb terminals. PCs really hadn't made an impact yet. There were a few original Macs around (no hard drives yet), and they were fun, but curiosities.

    At times it was weird. Exam questions like "does a rock have intentions?" (The classic "Why are manhole covers round?" was from the 1974 Stanford Comprehensive Examination on Computer Science.) I took "Dr. John's Mystery Hour" (McCarthy's AI seminar), and had good arguments. Met all the big names in '80s AI. But there was an "emperor has no clothes" feeling about the whole thing, because expert systems just didn't do much, and the whole field was stuck.

    About a year after I finished, Stanford put CS under the engineering department. The expert systems crowd was pushed off to the side, and the curriculum became much more practical.

  • by martyb ( 196687 ) on Saturday March 17, 2001 @07:22AM (#358145)

    The best aspects of my school's CS department had formerly been its focus on the fundamentals of CS ...

    Agreed! The languages I learned were helpful, yes, but it's what I learned how to do in those languages that ended up being far more important for me.

    Fundamental classes like "Data Structures", "Algorithms" (both the coding of them and being able to assess their performance), and "Assembly" gave me key insights into the CONCEPTS of what I could bring to bear on any given problem.

    Macros, subroutines, functions, pointers, queues and stacks and lists, are a means to an end. My success in programming has depended not so much on what FACTS I'd learned (e.g. arcane language syntax) but on how I could organize my thoughts and conceive appropriate levels of abstractions of the problem space.

    Slightly OT, but I just remembered Donald Knuth's series of books on the Fundamentals of Computer Programming. There were some truly great things in those books, but I was forever getting distracted by his "MIX" programming language. (And I was reading it well before I could even conceive of writing my own interpreter.) My hangup was that I couldn't type in a program and PLAY WITH IT, to see what it REALLY did. Would that he had written his code in C or some other mainstream language! Do any /.'ers know if Knuth (or anyone else) ever translated his examples into another programming language?

  • by vincea ( 219507 ) on Saturday March 17, 2001 @09:18AM (#358146)
    One area where I think CS schools are really short-changing students is testing and debugging. While there are quite a few schools that educate their students well in design, coding, algorithm analysis, and math, I've yet to even hear of a CS curriculum that prepares students in the areas of testing and debugging.

    This is a shame because as we all know, a large portion of any software engineer's time is spent testing and debugging code.

    I'm not talking about teaching how to use the specific features of debugger XYZ. I'm talking about the basic sleuthing skills needed to track down defects in your software. How to use basic concepts such as breakpoints, single-stepping through code, data watches, etc. - things that are common to all debuggers. How to systematically use deduction to narrow down where the defect is occurring. How to do unit testing, and why you should do it. General approaches for dealing with bugs in multithreaded/multiprocessed programs. Learning to write code that supports your debugging efforts and going beyond just using printf() for tracking down defects. (A small sketch of that last point appears at the end of this comment.)

    Most CS schools tend to just throw students to the wolves when it comes to testing and debugging - if they teach anything, they may teach how to use a specific debugger's interface. They usually just assume the students will learn various debugging techniques on their own. Some of them do. A lot of them don't.

    The ones who don't learn a large array of debugging techniques, when to use them, and why they should be used enter the workforce unprepared. They end up learning on the job, which can cause a lot of stress for them and their employers.

    This is pure speculation, but this may be a factor leading to the high number of defects in software programs today. A lot of ink-fresh-on-their-degrees CS students just don't know how to test and debug their software! While I don't think it's the only factor, it's something that is rarely discussed.
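
    To illustrate the "write code that supports your debugging efforts" point, here is a minimal C sketch (the macro and function names are invented, and this is only one of many possible approaches): a compile-time trace macro that records file and line, an assert that catches caller bugs early, and a tiny unit test you can run after every change.

    #include <assert.h>
    #include <stdio.h>

    /* A trace macro that can be compiled out: failures point at themselves
       instead of requiring a separate debugger session. */
    #ifdef DEBUG
    #define TRACE(fmt, ...) \
        fprintf(stderr, "%s:%d: " fmt "\n", __FILE__, __LINE__, __VA_ARGS__)
    #else
    #define TRACE(fmt, ...) ((void)0)
    #endif

    /* The function under test. */
    static int clamp(int x, int lo, int hi) {
        assert(lo <= hi);                  /* catch caller bugs loudly and early */
        TRACE("clamp(%d, %d, %d)", x, lo, hi);
        if (x < lo) return lo;
        if (x > hi) return hi;
        return x;
    }

    /* A tiny unit test: run it routinely, not just when something breaks. */
    int main(void) {
        assert(clamp(5, 0, 10) == 5);
        assert(clamp(-3, 0, 10) == 0);
        assert(clamp(42, 0, 10) == 10);
        puts("clamp: all tests passed");
        return 0;
    }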

  • by SecGuy ( 75963 ) on Saturday March 17, 2001 @07:01AM (#358147)
    I took undergraduate classes from CS departments at two different universities over a period of twelve years (1980-92). (I kept getting jobs and stuff, okay? :-) I've also hired and managed dozens of programmers since then.

    First I'll address this question as a student, then as an employer.

    University "A" had a much lower ranking, and did not emphasize research. University "B" was much larger, much better funded, and was rated in the top ten state university CS programs.

    I think that undergraduate programs in research-intensive fields such as computer science suffer tremendously when the priority is on research and not on teaching. The poster talks about approachability: this is WHY YOU ARE AT A UNIVERSITY, so you can interact 1:1 with people who have made studying CS their lives. Spending time with someone whose research is a notch less exciting is infinitely preferable to spending NO TIME with someone whose fascinating research you might as well have just read a book about.

    No big surprise, I learned a lot more stuff, both useful and interesting, at University "A" than at University "B". It was also a much more satisfying experience.

    I wish that I hadn't been such a slacker in High School, as I hear great things about some of the very, very top programs, that combine the best of both worlds.

    So: hire faculty who actually want to teach undergraduates, who will spend 1:1 (or 1:N for small values of N) time with them, and who are good communicators.

    As an employer, I really like the increased attention in a number of universities to group projects and methodology. Not that what they teach is necessarily any good (same for theory & algorithms) but because it at least provides some underpinnings and may persuade students that thinking about these things is important.

    It's extremely helpful to someone I'm interviewing if, upon graduation, they can sit down and demo a significant piece of software that they worked on with a team.

    So: make sure your CS graduates have at least some meaningful group-based project where they create something they can show people.

  • by LordNimon ( 85072 ) on Saturday March 17, 2001 @07:04AM (#358148)
    I received my MS degree at the same university where I got my BS degree (both in Computer Science). While a grad student, I taught undergraduates in exchange for free tuition. I can tell you one thing: I learned a whole lot more than the students did.

    Granted, this is just one school, but I think it's something you'll find in most universities. My students just did NOT want to learn the material. They would skip class and not visit me during office hours, and then complain that they didn't get enough help for the projects (I'm serious!). None of them showed any natural aptitude or curiosity whatsoever.

    I was one of those students who generally knew more than the teaching assistants, since I had already been programming for ten years by the time I was a freshman (I miss my TRS-80 :-)). But when it was my turn to teach, I was amazed at the laziness and ineptitude I saw. I know my classmates back in college were never this bad, so I can only assume one of two things:

    • I'm a really, really bad teacher and somehow I convinced the students they'd be better off not doing any work.
    • My students really were less competent.
    Reason #2 makes a lot more sense to me.

    So before you go trying to improve the CS curriculum, take a look and see if your classmates are worth it. You might be better off finding a good professor and volunteering to work with him on projects.
    --

  • by Nagash ( 6945 ) on Saturday March 17, 2001 @07:30AM (#358149)
    Someone else has mentioned it already, but I will reiterate: Java does not make a good language to start learning with in a Computer Science program.

    I am less than a month away from finishing a four year Honours Computer Science program at the University of Western Ontario [csd.uwo.ca]. When I started in 1997, we were taught Pascal for the first year course(s) and C for the second year, with C++, Java and others (including Scheme for a course) in third and fourth year, depending on the courses you took. After my first year, they switched to teaching Java off the bat and leaving C out of it (even for second year).

    The first thing that profs noticed was how much more difficult it was to get students to understand basic algorithms and simple CompSci concepts (trees, linked lists, etc). The OO stuff just got in the way. Everyone got hung up trying to figure out how to do simple things, because the object machinery (members, function declarations, etc.) was just confusing to wade through for students not familiar with the concepts. What ended up happening was that profs had to start explaining OO in some minor way, which only confused the students further.

    Now, some of this can be attributed to growing pains associated with changing a program curriculum. However, I am also currently a teaching assistant (TA) for the third-year operating systems course and the caliber of student I am seeing from a programming perspective is quite lacklustre. The OS course is required, so there are a lot of people taking it and I get to mark about a third of the assignments handed in. I will tell you now that I am, generally, not impressed with what I see. The students are beating themselves up trying to grasp the concepts of C when coming from a world of Java (recall, these third-year students never had C) since Java hides so much from them.

    What I'm getting at is that a good language to start with is not OO. Why? Because the concept of OO makes much more sense after you've been introduced to iterative/functional programming. The natural way to solve a problem is not with objects and obfuscation (i.e., good design), but to work out a solution and step through it a la iterative/functional programming. (more on this in a sec)

    You want to avoid Java because it hides too much. Java is good after you have an understanding of what is happening in the machine. In fact, Java is a nice language after the fundamentals are known to you.

    Now, about the above point of iterative/functional making more sense, naturally. Firstly, think about how you tend to solve problems: come up with an idea, walk through the steps to get it to work. Note that there is nothing about objects in there. A pseudo-code algorithm translates a lot easier to a C program than it does a Java program. Pseudo-code is what a lot of first/second year students see and it should not be a chore to implement it. Functional languages are natural for some (like me) and follow the principle of "when I'm done with this, I'll give it to you", which is also not too tough to grasp once introduced to it.

    Bear in mind - I am not anti-OO. I just think it's a more advanced concept that should be saved for later. I design everything around the OO philosophy now in my programs (no matter what language). It is good. It is just not for students starting out.
  • by mindstrm ( 20013 ) on Saturday March 17, 2001 @06:33AM (#358150)
    What is your definition of 'CS'? I ask this in all seriousness. Computing science/Computer Science varies greatly. In some schools, the program is about programming and technical details; in others, it's more about algorithms and theory. One is about science, the other is more about engineering. Which is it you want?

    Sysadmin/Web design/administrative programming or banging out simple apps is not computing science...
  • by Daniel Dvorkin ( 106857 ) on Saturday March 17, 2001 @06:48AM (#358151) Homepage Journal
    This presupposes that object-oriented is the be-all and end-all of programming. Which it's not.

    My college (which has a CS program I'm pretty happy with, all in all) teaches all the first- and second-year courses in Java now. When I went through, it was C++, which really meant "C with a bit of object stuff thrown in." Now, I've noticed that a lot of the newer students do have a kind of gut-level grasp of OO concepts and practices that I had to struggle to attain ...

    ... but the fact is that the very first "real program" I wrote at my job was in C. Not C++, not Objective-C, damn sure not Java: straight C. And if I'd only been trained in Java at that point, I'd have been fucked. I've been in classes with people -- smart people, well-trained people -- who spend most of the semester trying to figure out what a pointer is, because Java hides those kinds of details; or it drives them nuts that they have to write this thing called "main" that's not in a class; or the concept of memory allocation and cleanup to prevent memory leaks is completely foreign. So they get to the junior level in CS and they think they know how to program, and they _do_, but there's a lot missing. (A tiny sketch of what Java hides here appears at the end of this comment.)

    I like Java. I think it's a great language and I hope it continues to be used more widely. But neither Java itself nor OO programming in general is What Computer Science Is All About. There needs to be a mix of the high-level and low-level stuff. Evangelists for one or the other really annoy me.
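
    For the curious, a minimal C sketch of the details being talked about: pass-by-reference via a pointer, plus explicit allocation and cleanup. The names are invented for illustration, and this is only a sketch.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Pass-by-reference: the caller's variable is reached through its address. */
    static void increment(int *p) {
        *p = *p + 1;               /* modifies the caller's int, not a copy */
    }

    int main(void) {
        int counter = 41;
        increment(&counter);       /* &counter is a pointer to counter */
        printf("%d\n", counter);   /* prints 42 */

        /* Explicit allocation and cleanup: the part Java's GC hides. */
        char *name = malloc(32);
        if (name == NULL)
            return 1;
        strcpy(name, "hello");
        printf("%s\n", name);
        free(name);                /* forget this and you have a memory leak */
        return 0;
    }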
  • by dachshund ( 300733 ) on Saturday March 17, 2001 @06:49AM (#358152)
    I went to a small college that was busy trying to 'improve' their CS curriculum. Among other things, their particular approach was to put most of the introductory classes (first two years) onto the web, and switch all coding from C into Java, with an emphasis on newer technology. They even got a major grant from the government to try this 'experiment'.

    I can only describe the result as an unmitigated disaster. The best aspects of my school's CS department had formerly been its focus on the fundamentals of CS; lacking a huge, monied department, we were taught how things work rather than how the latest database software interfaced (and this had produced generations of very successful graduates). This curriculum had evolved over a period of years; naturally, there were newer classes, and we weren't still coding in Fortran, but the curriculum had been tested and slowly improved over time. The spate of sudden 'improvements' wrought by these changes has irrevocably changed the nature of our department. Aside from the fact that we have a generation of majors who don't know what a pointer is, much more effort is expended teaching the latest software or OS, to the detriment of the fundamental cross-language, cross-OS education that I enjoyed. So the point of all of this is that sudden, untested changes in a CS curriculum can cause as much harm as good.
