Which Math For Programmers?
An anonymous reader writes "It is no news that the greatest computer scientists and programmers are/were mathematicians. As a kid 'hacking' if-else programs, I was not aware of the importance of math in programming, but a few years later, when I read Engines of Logic by Martin Davis, I became increasingly convinced of it. Unfortunately, math doesn't return my love, and prefers to watch me struggle with it. Now, as the end of the semester approaches, I am faced with a dilemma: which math subject to choose next? I have two choices: 'Discrete structures with graph theory' (discrete math: proofs, sets, algorithms and graphs) on one side, and 'Selected math chapters' (math analysis: vectors, Euclidean space, differentials) on the other. I'm scared of the second one because it's said to be harder. But contrary to my own opinion, one assistant told me that it would be more useful for a programmer than the first subject. Then again, he's not a programmer. That's why I turn to you for help, fellow Slashdotters — any advice?"
That depends on you... (Score:5, Interesting)
If you're just worried about the programming (coding and maybe some design) side of things, then the math you need is going to be the math that applies to what you're coding (calculus for physics engines, algebra for accounting packages, statistics for reporting, etc.).
On the other hand, if you think it will benefit you to know more about what underlies the code (it does me, but we may think in different ways), then I would say you should absolutely take the Discrete. Computer Science is 95% applied Discrete Mathematics. Computer Science is also a lot of theory which, truth be told, tends to be of very specialized use to developers unless they're working at the very low levels. After taking DM for my degree, I found that my code improved, though I admit that's anecdotal.
Re:The Second, If Not Both (Score:5, Interesting)
Discrete, NA, then PDE (Score:5, Interesting)
Re:Discrete structures (Score:2, Interesting)
FWIW, I used to write risk-analysis software for the options trading industry. We used a lot of calculus and differential equations in computing the theoretical pricing and risk factors of derivatives. However, before that I did about 20 years developing real-time manufacturing systems software and knowledge of discrete structures, formal logic, and proofs (to detect race conditions in complex systems) was most useful there.
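Cycle detection in a wait-for graph is one concrete place where that discrete-structures background pays off in systems work. This is a generic illustration, not the poster's actual tooling: a minimal Python sketch of deadlock detection via depth-first search over which task waits on which.

```python
def find_cycle(wait_for):
    """Return a cycle in a wait-for graph, or None if the system is deadlock-free."""
    WHITE, GREY, BLACK = 0, 1, 2
    color = {task: WHITE for task in wait_for}

    def visit(task, path):
        color[task] = GREY
        path.append(task)
        for blocker in wait_for.get(task, ()):
            if color.get(blocker, WHITE) == GREY:          # back edge: circular wait
                return path[path.index(blocker):] + [blocker]
            if color.get(blocker, WHITE) == WHITE:
                cycle = visit(blocker, path)
                if cycle:
                    return cycle
        path.pop()
        color[task] = BLACK
        return None

    for task in list(wait_for):
        if color[task] == WHITE:
            cycle = visit(task, [])
            if cycle:
                return cycle
    return None

# A waits on B, B waits on C, C waits on A: a classic circular wait.
deadlocked = {"A": ["B"], "B": ["C"], "C": ["A"]}
print(find_cycle(deadlocked))               # ['A', 'B', 'C', 'A']
print(find_cycle({"A": ["B"], "B": []}))    # None
```

The same depth-first machinery shows up throughout systems code, which is why the graph-theory course transfers so directly.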
Algebra... (Score:5, Interesting)
Proofs, proofs, then more proofs.
Programming is all about isolating the smallest part of a problem and simplifying it out. Doing proofs is effectively the basis for programming.
Understanding trig and calc is handy for specific projects, but for every single program we write we have to be able to see the problem, to isolate components of the problem, and to simplify them.
-Rick
He's an idiot. (Score:4, Interesting)
Depends on the Problem (Score:3, Interesting)
It ultimately depends on the kinds of problems you're going to end up working on. Any sort of graphics programming is going to require a solid understanding of geometry. Designing games requires probability/statistics, where the actual math could often be understood by a bright junior high student, but gets combined in complicated ways.
Calculus is overrated for anyone not going into Physics or Engineering. I wish schools would put more emphasis on statistics instead, since that's useful for anyone who picks up a news report and sees that there's a 2% spread of support for a pair of political candidates.
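To make that point concrete, the margin of error behind such a spread is a one-line formula. A minimal sketch, using the standard normal approximation and a hypothetical poll:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p estimated from a sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A candidate polling at 48% in a survey of 1000 voters:
moe = margin_of_error(0.48, 1000)
print(f"+/- {moe:.1%}")   # about +/- 3.1%
```

With a roughly 3-point margin of error on each candidate, a 2% spread is a statistical tie, which is exactly the kind of reading a statistics course teaches and a calculus course doesn't.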
More important than any of that, IMHO, is being able to see how the program fits together on an abstract level. This can be described as a form of math, but it's well outside of what most people think of as math. Which is fine, because what most people think of as math has nothing to do with what mathematicians do all day. Just the same, it's not necessarily anything that gets taught by formal math courses, either, at least not directly. Rather, more advanced math leads to better abstract thought in general. So just take more math, whatever it is, and you'll be indirectly learning how to be a good programmer.
Re:The Second, If Not Both (Score:5, Interesting)
[...] The second is going to give you practical skills in programming -- a wide array of practical skills. The first is most likely going to give you some automata theory for computers, but unless you're going into theoretical research, the second is the obvious answer. Graphics and games are all vectors, and the web is becoming even more so with new browser rendering technologies. Rendering is all Euclidean space projected onto a two-dimensional plane (screen) using points (pixels). Differentials are huge in the vision and image processing world and, again, in graphics. This is your obvious selection[...]
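The rendering math being referred to boils down to a perspective divide. A toy sketch of a pinhole camera at the origin, with made-up screen dimensions:

```python
def project(point, focal=1.0, width=640, height=480):
    """Perspective-project a 3D point onto a 2D screen (pinhole camera at origin)."""
    x, y, z = point
    if z <= 0:
        raise ValueError("point is behind the camera")
    # Perspective divide, then map to pixel coordinates centered on the screen.
    sx = (focal * x / z) * width / 2 + width / 2
    sy = (-focal * y / z) * height / 2 + height / 2   # screen y grows downward
    return sx, sy

print(project((0.0, 0.0, 5.0)))   # a point straight ahead lands at (320.0, 240.0)
print(project((1.0, 1.0, 5.0)))
```

The division by z is the whole trick: points twice as far away land half as far from the screen center, which is what the analysis course's vectors and Euclidean space are describing.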
I couldn't disagree more. There is no "obvious selection," because the OP didn't mention what type of programming interests him. If you're going to specialize in graphics or scientific computing, yes, the analysis course would be helpful. However, I find that branch of mathematics completely useless for the programming work that I do.
In more systems-oriented programming (e.g., OS, compilers, networking, databases), a strong background in algorithms, data structures, and graph theory is absolutely essential. If you start moving into security and cryptography, you need to understand modern algebra topics like number theory and group theory; having a solid foundation in set theory is a prerequisite for any of those topics.
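For a flavor of how that number theory shows up in cryptography, here is a minimal sketch of modular exponentiation driving a textbook-sized Diffie-Hellman exchange. The parameters are toy values for illustration only, nowhere near secure:

```python
def power_mod(base, exp, mod):
    """Square-and-multiply modular exponentiation, the workhorse of public-key crypto."""
    result = 1
    base %= mod
    while exp:
        if exp & 1:
            result = result * base % mod
        base = base * base % mod
        exp >>= 1
    return result

# Toy Diffie-Hellman in the group Z_23* with generator 5.
p, g = 23, 5
a, b = 6, 15                        # private keys
A = power_mod(g, a, p)              # Alice sends 8
B = power_mod(g, b, p)              # Bob sends 19
shared = power_mod(B, a, p)         # both sides derive 2
assert shared == power_mod(A, b, p)
print(shared)                       # 2
```

The group theory is what tells you why this works and which groups are safe to use; the set theory underneath is what makes the proofs readable in the first place.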
[...] although I challenge you to take both [emph. added]. Also, look for courses on classes that blur the lines between stats/math and computer science. Like courses on error correcting codes or computer language design and theory.
On this point, we agree.
It's changed. Get more number-crunching. (Score:3, Interesting)
I have an MSCS from Stanford (1985), and the field has changed since then. Back then, it was all about discrete math - number theory, combinatorics, mathematical logic, computability, and proofs. There was no number-crunching at all in the curriculum. Of course, back then, an FPU was an extra-cost option on a PC. I've actually done automated program verification work. [animats.com] But outside of IC design (where formal methods are routinely used), there's not much of that going on now. Now, number-crunching has come to the fore.
In the 1990s, I spent several years on what turned into ragdoll physics for games. That's all about differential equations and number-crunching. I had a hard time switching over. But I finally got used to deterministic number-crunching. I have no mathematical intuition for it, though; I took it up too late in life.
Now, the leading edge of computer science is probabilistic number-crunching. Take a look at Stanford's CS229 - Machine Learning [stanford.edu] class. That's the technology that's driving AI now, and it's working across a broad range of domains. The logicians are out, and the statisticians are in.
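For a taste of the probabilistic number-crunching being described, here is a tiny logistic regression fit by gradient descent in pure Python. The "hours studied" data is invented for illustration; real work would use a proper library:

```python
import math

def train_logistic(xs, ys, lr=0.5, steps=2000):
    """Fit a 1-D logistic regression y ~ sigmoid(w*x + b) by gradient descent."""
    w = b = 0.0
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x     # gradient of the log loss w.r.t. w
            gb += (p - y)         # gradient w.r.t. b
        w -= lr * gw / len(xs)
        b -= lr * gb / len(xs)
    return w, b

# Hours studied vs. pass/fail: the model learns a threshold near 3.5 hours.
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [0,   0,   0,   1,   1,   1]
w, b = train_logistic(xs, ys)
predict = lambda x: 1.0 / (1.0 + math.exp(-(w * x + b))) > 0.5
print([predict(x) for x in xs])   # [False, False, False, True, True, True]
```

That gradient loop, scaled up by many orders of magnitude, is essentially what a course like CS229 spends a semester analyzing - and the analysis is statistics and calculus, not logic.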
Re:Study what you enjoy (Score:3, Interesting)
I can see no use for any of the math disciplines that you listed in the majority of user interfaces...just curious is all.
I personally have used statistics heavily when it comes to testing user interfaces - particularly with live users (as in, 'How intuitive is the interface?') It is good to have a solid understanding of what goes into statistics to make sense of the data you get from such tests to help further improve your interface.
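As an illustration of the kind of statistics involved, here's a sketch comparing task-completion times for two interface variants. The timings are made up, and the confidence intervals use a rough normal approximation:

```python
import statistics

def summarize(times):
    """Mean task time with a rough 95% confidence interval (normal approximation)."""
    mean = statistics.mean(times)
    sem = statistics.stdev(times) / len(times) ** 0.5   # standard error of the mean
    return mean, mean - 1.96 * sem, mean + 1.96 * sem

old_ui = [48, 55, 61, 52, 58, 64, 50, 57]   # seconds to complete a task
new_ui = [41, 38, 45, 43, 39, 47, 40, 44]
print(summarize(old_ui))
print(summarize(new_ui))
```

If the two intervals don't overlap, as here, you have decent evidence the redesign actually helped rather than just getting lucky with eight testers - which is exactly the judgment call the statistics background supports.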
I can't speak to the other ones (and am having difficulty coming up with a use for calculus). I personally feel that psychology plays a stronger role in UI design. That is from my own experience, however. YMMV.
Re:Physicists? (Score:5, Interesting)
Re:Physicists? Yes, really. (Score:2, Interesting)
I don't get why you'd want to mod me troll, since I don't see where we disagree. Of course knowledge of basic linear algebra is an important part of the skill set of anyone who writes computer programs. But that's not the same as being able to write a linear solver off the top of your head, and it's also not the only or most important aspect of software engineering.
As an example: during the job interview for my current job I was asked to spell out, over the phone, a program that calculates the intersection of two lines, to integrate expressions, and other math trivia. I couldn't do it, though I was able to formulate the steps to do it offline. The other questions were purely about basic C and Unix trivia, which I was able to answer easily, though most of it was in fact irrelevant or obsolete for today's software engineering. The interviewer found it really strange that I wasn't able to answer 'easy math questions' while I did seem to 'know a lot about software engineering', which was completely backwards from my own impression of the interview.

After that interview I've implemented simulation software containing non-linear regression, all kinds of matrix stuff, DFTs, half of Matlab's basic linear algebra tools, image processing and more. Not because I can tell you how to do an SVD, but because of good software engineering practices and by making efficient use of existing solutions to common problems. That's the difference between writing software on a larger scale and using computer programs to solve mathematical problems.

It's also why I view so much math- and physics-related software as 'badly engineered' (though often well-written): it's confusing; the documentation is minimal, absent, or dense and incomprehensible; it's sensitive to misuse and inflexible; in other words, difficult to use safely in production software. All the math stuff I use in our code is wrapped into sensible and safe interfaces that are well-documented and easy to use, which is what allows us to rapidly build complex but robust software from it.
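Incidentally, the line-intersection interview question is itself a nice example of the "wrap the math in a sensible interface" point. A sketch using Cramer's rule, with the parallel case handled explicitly rather than left to blow up in the caller's face (the function name and tolerance are my own choices, not from the original post):

```python
def intersect_lines(p1, p2, p3, p4, eps=1e-12):
    """Intersection of the line through p1-p2 with the line through p3-p4.

    Returns (x, y), or None if the lines are parallel or degenerate.
    """
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    # Cramer's rule: the denominator is the 2x2 determinant of the direction vectors.
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(denom) < eps:
        return None                      # parallel or coincident: no unique answer
    det1 = x1 * y2 - y1 * x2
    det2 = x3 * y4 - y3 * x4
    x = (det1 * (x3 - x4) - (x1 - x2) * det2) / denom
    y = (det1 * (y3 - y4) - (y1 - y2) * det2) / denom
    return x, y

print(intersect_lines((0, 0), (2, 2), (0, 2), (2, 0)))   # (1.0, 1.0)
print(intersect_lines((0, 0), (1, 1), (1, 0), (2, 1)))   # None (parallel)
```

The math is a couple of determinants; the engineering is the documented contract, the explicit failure mode, and the tolerance parameter, and it's the latter that makes the function safe to use in production code.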
It's also for these reasons that I think basic knowledge of applied mathematics plus in-depth knowledge of algorithms, data structures, and development methodologies is more valuable than in-depth knowledge of mathematics with only basic knowledge of how to build and maintain quality software.
Good question--but too late for me! (Score:3, Interesting)
I got a Ph.D. in Philosophy back in '78 (am I going to have to specify that in four digits soon?), and one day when I was moaning around because I couldn't find a job with reasonable pay and even minimal dignity, a friend said to me, "go into computers, Vomact". I said something like, "huh? But I'm terrible at math!". He told me not to worry, "there's no math required, it's all logic". Overall, I've found that to be true. Basically, you need a mental tool-box to solve programming problems, and those problems have been mostly logic problems for me, so my tools worked just fine. I think that maybe studying mathematics gives you similar tools, but I've always suspected there's some kind of mathist prejudice at work in CS departments that require calculus as a prerequisite. I think they just put it on the list to act as a filter to keep people who should get an M.B.A. or something else trivial from wasting their time. But it's a filter I couldn't have passed. Luckily, there were very few formally trained programmers back in the early eighties, and someone like me could talk his way into a software job.
It's obvious, of course, that if you intend to write programs that actually use mathematics, then you'd better study math—if you're going to be a scientific programmer, for example, just as you'd better understand statistics if you want to write actuarial programs for insurance companies. In fact, depending on what kinds of software you design or write, there are a lot of things you might be called to know...and you can't know the list in advance, when you're still in school. Just be prepared to keep learning when you leave school—in fact, that's when the learning really starts.
No, I am not saying that studying maths is a bad idea or a waste of time. On more than one occasion, I've gotten essential insights into difficult programming problems that involved mathematical and geometrical understanding from mathematicians, so I'm quite prepared to respect their training. I just don't think it's a prerequisite for the job.
As others have pointed out, the article summary invites confusion by conflating computer science with programming. I don't see why you need calculus for either, though.
Re:Both skillsets are needed (Score:5, Interesting)
It's quite shocking, isn't it? And not just a "this code isn't very pretty" problem either; the instability of the thing is remarkable.
Remarkable? It's legendary! ROOT has the dubious distinction of including the worst piece of programming I have ever seen. When adding some extra I/O objects in separate header files to a program, generating dictionaries, etc., I suddenly had the linker complaining about duplicate symbols. After spending just over a day trying to figure out why, and getting more and more confused, I finally demonstrated that ADDING A COMMENT to the code fixed the problem!
Staggered that ROOT was somehow breaking the C++ standard in ways I had not even contemplated, I spent another day tracking this down to an automatically named function which used the C preprocessor line number as the only variable part of its name. So, if you happened to have the classdef (IIRC) macro on the same line in two header files, ROOT would generate identically named symbols. The result was that something as simple as adding or removing a comment could fix, or cause, a duplicate symbol problem!
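The failure mode described can be simulated without ROOT at all, since the bug is purely in the naming scheme. A toy Python reconstruction: the symbol format below is invented, and only the line-number dependence reflects the behavior described above.

```python
def generated_symbol(line_number):
    # The generated name (as described above) depends on the preprocessor
    # line number alone -- nothing about the file or the class being defined.
    return f"_auto_fn_{line_number}"

# Two unrelated headers that happen to place the macro on the same line:
sym_a = generated_symbol(12)   # header_a.h, macro on line 12
sym_b = generated_symbol(12)   # header_b.h, macro on line 12
print(sym_a == sym_b)          # True: duplicate symbol at link time

# Adding a comment above the macro in one header shifts its line number,
# which is why a comment could "fix" (or cause) the duplicate-symbol error.
sym_b_shifted = generated_symbol(13)
print(sym_a == sym_b_shifted)  # False
```

Once you see the collision is a pure function of the line number, the "comment fixes the linker error" mystery stops being magic and starts being a one-line bug.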
Far, far worse than all that, though, was that when I submitted a nice bug report illustrating the problem and pointing out the macro at fault, they claimed that this was NOT A BUG and did not need fixing! ARGH! I only ever use ROOT through the Python interface now, since it shields you from so much of the pain... just not all of it, unfortunately!