Ask Slashdot: Have You Read 'The Art of Computer Programming'? (wikipedia.org) 381
In 1962, 24-year-old Donald Knuth began writing The Art of Computer Programming, publishing three volumes by 1973, with volume 4 arriving in 2005. (Volume 4A appeared in 2011, with new paperback fascicles planned for every two years, and fascicle 6, "Satisfiability," arriving last December). "You should definitely send me a resume if you can read the whole thing," Bill Gates once said, in a column where he described working through the book. "If somebody is so brash that they think they know everything, Knuth will help them understand that the world is deep and complicated."
But now long-time Slashdot reader Qbertino has a question: I've had The Art of Computer Programming on my book-buying list for just about two decades now and I'm still torn...about actually getting it. I sometimes believe I would mutate into some programming demi-god if I actually worked through this beast, but maybe I'm just fooling myself...
Have any of you worked through or with TAOCP or are you perhaps working through it? And is it worthwhile? I mean not just for bragging rights. And how long can it reasonably take? A few years?
Share your answers and experiences in the comments. Have you read The Art of Computer Programming?
Unfortunately no and I have a reason (Score:4, Informative)
Unfortunately no and I have a reason:
Reading those books requires a high degree of mathematical sophistication, particularly knowledge of complex analysis, which I lack.
Re:Unfortunately no and I have a reason (Score:5, Informative)
Reading those books requires a high degree of mathematical sophistication, particularly knowledge of complex analysis, which I lack.
They're just algorithms textbooks. They're hard to read because of when they were written and the accompanying style: more like pseudo-assembly than high-level pseudo-code.
But, hey, if you want to optimize your search algorithm that uses tape as storage, to take advantage of the new-fangled tape drives that can write backwards as well as forwards, it's the book for you! (Yes, that was really a thing, and an algorithm you'll find in Volume 3: Sorting and Searching.)
Personally, I don't think he does a great job explaining algorithms. I once needed to look up O(n) median for something, tried to understand it from Knuth, gave up on the cryptic text, and understood it right away from CLR (now CLRS). It is an exhaustive catalog, but it's not a great learning tool.
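For reference, the worst-case O(n) selection algorithm being alluded to here (the median-of-medians method, as presented in CLRS) can be sketched in a few lines of Python. This is an illustrative sketch, not Knuth's or CLRS's exact presentation, and the names `select` and `median` are mine:

```python
def select(items, k):
    """Return the k-th smallest element (0-indexed) in worst-case O(n) time."""
    items = list(items)
    if len(items) <= 5:
        return sorted(items)[k]
    # Take the median of each group of 5, then recursively take the median
    # of those medians; this pivot is guaranteed to split the input well.
    medians = [sorted(items[i:i + 5])[len(items[i:i + 5]) // 2]
               for i in range(0, len(items), 5)]
    pivot = select(medians, len(medians) // 2)
    lows = [x for x in items if x < pivot]
    pivots = [x for x in items if x == pivot]
    highs = [x for x in items if x > pivot]
    if k < len(lows):
        return select(lows, k)
    if k < len(lows) + len(pivots):
        return pivot
    return select(highs, k - len(lows) - len(pivots))

def median(items):
    """Lower median of a sequence, via linear-time selection."""
    return select(items, (len(items) - 1) // 2)
```

Whether this reads more clearly here, in CLRS, or in Knuth is of course exactly the point being debated.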
Re: (Score:2)
>CLR (now CLRS)
???
Common Language Runtime?
Calcium, Lime & Rust?
Re:Unfortunately no and I have a reason (Score:4, Informative)
The "R" is Rivest, who invented the algorithm for O(n) median, and some other stuff.
Re:Unfortunately no and I have a reason (Score:5, Interesting)
CLR is Introduction to Algorithms by Cormen, Leiserson and Rivest. The S in CLRS is for Stein, who joined the team for the 2nd edition. When CLR came out in 1990, it was hailed as the best algorithms textbook ever, and what an algorithms textbook should be: a huge jump in readability and clarity over the not wholly satisfying existing algorithms textbooks. It uses pseudocode instead of a real programming language, which allowed the algorithms to be presented cleanly, without any boilerplate code, overhead, or worries about limitations: no need for tedious checks for array out of bounds, numeric overflow, out of memory, or invalid input, and no need to declare variables or figure out how many elements an array needs.
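As an illustration of the point about pseudocode, here is roughly what a CLRS-style algorithm (insertion sort, from the book's opening chapter) looks like transcribed into Python; the transcription and variable names are mine, not the book's:

```python
def insertion_sort(a):
    """Insertion sort, kept close to textbook pseudocode:
    no bounds, type, or input-validity checks."""
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        # Shift elements larger than key one slot to the right.
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a
```

The algorithm is the whole listing; in a production language much of the surrounding code would be validation and error handling instead.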
The Abelson and Sussman textbook, Structure and Interpretation of Computer Programs, uses LISP (actually Scheme). There are quite a few LISP fanatics who passionately feel it is still the best programming language ever made, citing such reasons as the simplicity of writing an interpreter for it. However, that textbook is pretty difficult. The authors didn't appreciate how hard recursion can be for many students to understand, and LISP and functional programming in general use recursion so heavily that it's the proverbial hammer for every nail of a programming problem.
Since then, programming languages have improved. Still not good enough for the textbook, but closer.
Re: (Score:3)
presented .... without any boilerplate code, overhead, or worries about limitations, no need for tedious checks for array out of bounds, numeric overflow, or out of memory, or invalid input.
Wait - did I read that correctly? "without any boilerplate code, overhead, or worries about limitations, no need for tedious checks for array out of bounds, numeric overflow, or out of memory, or invalid input" = improved textbook?
Aren't these the attack vectors used by malware and viruses today?
I think I'm lost. We use a newer shiny shiny that shows us to do something without showing it done safely and it's better because people will magically include the necessary safety checks and our new algorith
Re:Unfortunately no and I have a reason (Score:5, Insightful)
Re:Unfortunately no and I have a reason (Score:5, Informative)
presented .... without any boilerplate code, overhead, or worries about limitations, no need for tedious checks for array out of bounds, numeric overflow, or out of memory, or invalid input.
Wait - did I read that correctly? "without any boilerplate code, overhead, or worries about limitations, no need for tedious checks for array out of bounds, numeric overflow, or out of memory, or invalid input" = improved textbook? Aren't these the attack vectors used by malware and viruses today?
Everything you mentioned is supposed to be a given. A person who needs an explicit indication of them is not at the level required to use a book like CLRS. I don't mean that as an insult, but as an observation.
Moreover, many of the checks you mention are handled by constructs and idioms that are language dependent. For example, boundary checking in C will be different from, say, Ada or Java, let alone something like Ruby or LISP.
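To illustrate how language-dependent boundary checking is: in Python, an out-of-range index raises IndexError rather than silently reading past the array as C can, so the "tedious check" collapses into a try/except idiom. This `safe_get` helper is a hypothetical example, not taken from any of the books discussed:

```python
def safe_get(seq, i, default=None):
    """Return seq[i], or default when i is out of range.

    Python's runtime does the bounds check for us and raises
    IndexError; C code would need an explicit length comparison.
    """
    try:
        return seq[i]
    except IndexError:
        return default
```

The same algorithm written in Ada would instead lean on range-constrained types, and in C on manual checks; the pseudocode in a textbook sensibly stays neutral on all of this.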
Also, when you are studying algorithms at that level, you are assumed to have a certain maturity that makes reference to such things irrelevant. Think of it like this: If you are learning how to solve quadratic equations, you do not need a lesson in adding fractions, do you?
Same principle applies here. When you are taking a book like CLRS, it is to study the mathematical properties of algorithms.
I would say that there is a more hands-on book that directly addresses these concerns: O'Reilly's Algorithms in a Nutshell. This is a really nice pocket book.
Separation of Concerns (Score:3)
'supposed to be" ???
Clearly you haven't been paying attention. All the attack vectors we currently have in code exist because coders, er, uh, sorry, I know you prefer the term 'developers' these days, don't do any-damn-thing about input testing, overflow, invalid input, etc.
So, don't climb on your high horse and lecture about how the 'best textbook' doesn't teach a damn thing about them, when that led to the code we now have to try to make secure.
So, also, if all that is, "... a given...", then why is none of it done reliably????
Ramble ramble, gurrr gurrr, hear me roar.
Oh, I've been paying attention. A significant chunk of my professional experience (22 years) has been involved in secure programming and in scanning and fixing vulnerabilities left by accident (or more often than not, by incompetent developers) in the commercial and defense sectors. I'm not exaggerating that my entire career has been devoted to fixing other people's fuck ups. So when I say something it is (most likely) because I have some experience in the matter.
Re: Unfortunately no and I have a reason (Score:3)
Re:Unfortunately no and I have a reason (Score:5, Interesting)
Personally, I don't think he does a great job explaining algorithms .... It is an exhaustive catalog, but it's not a great learning tool.
I agree with all of the above. There are better books. But criticizing TAOCP is like saying the emperor has no clothes. The series is difficult to understand, and everyone thinks it is their own fault for not being smart enough, rather than that the books actually aren't very good.
Disclaimer: I own the entire series, and keep them in my office to impress people, but I haven't actually opened them in the last 20 years.
Additional Disclaimer: I loved Don's "Concrete Mathematics" book... for the humor as well as the math. I keep it in my office to impress people too.
Re:Unfortunately no and I have a reason (Score:4, Informative)
I constantly switch between the major C-family languages (not a major feat): C, C++, Java, C#; and my resume has Visual Basic, Ada, Python, Perl, Assembly, FORTRAN, LabWindows
A lot of these languages are kind of samey-samey; the mind expansion he's talking about comes from working in languages that use very different programming models and make you approach problem-solving in new ways. Try adding, say, Haskell (or SML or Elm), Forth, and Prolog to the list, for starters.
Re: Unfortunately no and I have a reason (Score:5, Insightful)
So you can't read them?
They are some of the most valuable books I've ever owned. Whether during programming, developing FPGA logic, or just trying to have a method of proving some of my algorithms, TAOCP is invaluable.
Computer science is still... well science and TAOCP is a cornerstone of computer science. You're confusing computer science and application programming.
There are certainly better places to learn the basics. Like for example, what is a red black tree? But TAOCP is where you understand them.
And I think his pseudo-computer concept was nifty. As for FORTRAN, all the FORTRAN developers I know are generally slobs. They'd never touch these books.
Re: Unfortunately no and I have a reason (Score:5, Interesting)
I can't say I've ever read Knuth in the literary sense. It's more of the ultimate reference.
Thankfully, I no longer have to construct sorts, searches, random number generators, etc. every time I write a program, since modern-day language systems come pre-supplied. However, someone has to implement those pre-supplied algorithms, and that someone almost certainly referenced Knuth. He has provided a concise, well-explained collection of analyses, discussions and sample implementations of many of the most important functions needed by almost everyone in the software development field.
The downside of it is MIX. I understand the reason Knuth created MIX, but MIX is an assembly language and not even a commonly-used one (and intentionally so). I know that there are many who think that the purpose of computers is to run machine (assembly) language, but they're wrong.
The purpose of computers is to run software. How it runs the software is secondary. You can see this in the fact that some machines include in their instruction set specialized functions related to specific abstract features. The Prime Computer instruction set had machine-language implementations of the FORTRAN 3-way branch. I think it was Honeywell that had an OS task dispatcher function implemented as a machine-language instruction. The Intel iAPX 432 was designed to run Ada, the IBM System/360 had the Translate instructions, and the IBM zSeries has a major subset of the Unix stdlib implemented as CISC instructions, including a few of the Knuth algorithms.
Keeping software as a set of bits corresponding to primitive functions is convenient for implementing von Neumann machines, but it should be realized that it's the means and not the end, and when scaled, tends to lose the abstract picture in the minutiae. If minutiae were the end of it, one should be programming microcode and manually switching gates; on many machines, the "machine language" is itself an abstraction interpreted via microcode.
I've often complained that one of the biggest annoyances in IT is that you can pop out a prototype GUI in a day or 2 and everybody thinks it should go to full production next Thursday, where no one in their right mind would ever expect a scale cardboard model of a building or bridge to be ready for use in such time.
But there's a similar break in education in that in most cases CIS courses use a "practical" language instead of an abstract one.
While assembly language was very common when Knuth wrote Volumes I-III, he was aware that there was no universal high-level language any more than there was a universal low-level one, which was another reason he invented MIX. Even academically-inclined languages such as Modula ultimately detoured into practical use, warping their abstraction. And, although ideological purity makes me projectile-vomit, when you're dealing with abstract concepts, I prefer to keep things truly abstract. Otherwise it warps one's approach to later solutions.
Dijkstra did, in fact, attempt an abstract high-level language in his "A Discipline of Programming", although it has some odd warts of its own. Personally, I'd vote for Ada, because while the last time I actually ran Ada it nearly burned out the bearings on a mid-line IBM mainframe, it is the only common language I know where values and functions are assigned ranges and domains, an essential concept that is taught at the very beginning of differential calculus classes but not nearly well-stressed enough in most CIS curricula. Were it otherwise, perhaps fewer satellites would have been destroyed.
Ada, incidentally, is part of the gcc/gnat toolset available on almost any Linux development system and modern-day PCs are considerably more powerful than a 1990 mid-level IBM mainframe (but then again, probably so are most cellphones), so anyone who's curious can explore.
Torn between reading and doing (Score:5, Interesting)
I read the first three books in University and did examples from the first two when I started debating with myself, friends and professors, is it better to have the ultimate reference or be able to create code on your own as the requirements come up?
Over the thirty plus years since, I'm happy to say that volume two and three have gotten pretty ratty as I've used them as references (along with "Programming in C", 2nd edition) so I feel like I've struck the right balance (for me) between reading them, using them as reference and creating my own code/algorithms.
Re: (Score:3)
Few developers are going to have to know how to really code, or what is really happening in the engine they are using.
And THIS is why I get paid nearly twice as much as all the people I work with who fit exactly this mold. Because when you do understand how the code really works, you can make software perform better than anyone believes. For example, 10,000 complex business logic transactions per second with database and third party interactions on 3 boxes of physical hardware with a total cost under $20,000. (Probably less if our ops people didn't insist on IBM branded servers.)
Daughter was a big fan when she was younger (Score:4, Funny)
Although she preferred his other works, like The Land Before Time and Anastasia.
May the Source Be With You (Score:2)
Read the first volume (Score:5, Interesting)
It's really great reading if you do stuff like program low-level (think C, Assembler), efficient programming or do stuff close to the hardware level (such as microprocessors). It describes the very low level of a program and a computer.
If you're into a higher level of programming (Java, C#, Python, etc.), it is probably going to confuse you unless you're building libraries for it; most of the 'hard stuff' (double precision, floating point, sorting and searching through lists...) is abstracted away. Obviously 'someone' has to know how it works in the end; someone has to write the compilers. I haven't started on the rest of the volumes because that's not "me".
You should understand how computers work before you start reading these. I've been in the 'business' for 20 years, and I've read it 3 times just to get a basic grasp of the first volume.
Re: (Score:2)
It's also well worth the effort (and it is a lot of effort) to read the third volume, Sorting and Searching. The second volume (Seminumerical Algorithms) may be useful if you do certain kinds of work, but Fundamental Algorithms and Sorting and Searching are worth almost any professional programmer's time.
I have to admit I haven't bought 4A yet.
I really hope that Knuth is grooming someone to take over the work of completing the full set when he dies, or becomes unable to continue.
Re:Read the first volume (Score:4, Insightful)
It describes the very low level of a program and a computer.
No it doesn't. It describes the very low level of a program running on a computer from 30-50 years ago. The lessons that it teaches about algorithmic complexity are still valid, but the low-level stuff is not. Once you get to the limits of the implementation, rather than of the algorithm, artefacts of caches and pipelines are far more important to performance. Not only will you not find, for example, Hopscotch Hash Tables in TAOCP, you also won't find an explanation of the underlying reasons for their performance.
Re:Read the first volume (Score:4, Funny)
It's a difficult read between full time work, family and social lives
Who let this guy in here? How'd he get past security?
Re: (Score:2)
good one
A SkyRim Read? Then Yes. (Score:5, Funny)
Absolutely. If you want depth, read it. (Score:2)
Parts (Score:5, Informative)
Re: (Score:3)
Anyway, my question is: what is MMIX, and how does it differ from MIX, which was rather limited, even for the 6502 programmer I was at the time?
MMIX is a 64-bit RISC-like processor that reflects where CPUs were headed in the 1990-2000 time frame. I found the algorithms coded in MMIX assembler much easier to understand than the original MIX versions. There are one or more simulators and tool sets available for it if you want to actually test some code.
huh? (Score:2)
Improved open source program. (Score:2)
I used Vol. 2 to improve the multiply algorithm in an open source program.
I have read some (Score:4, Interesting)
I have them. I have studied small parts of some of them. I have been delving into them for over 30 years.
For day to day programming, I do not need or use the detail in those books.
At various times in the past, I have delved into library writing, and then they were very helpful, mostly in understanding issues and problems that I had not thought about. But I think time has moved on. Hardly anyone needs the details in those books, and in many cases, some classes of problems are well solved.
Looking back, I am glad that I studied some parts. But today I would not recommend them. Unless you really wanted to look back at history.
Yes, and it's good. but not gospel (Score:2, Informative)
It depends on what you mean by "work through it". Do all the exercises? Some are unsolved problems, so that's not terribly realistic.
There's nothing in the books that's not also discussed elsewhere (with the possible exception of the very thorough discussion of out-of-core sorting with tapes, which is a bit unusual these days), but it takes quite a few other books to equal the series.
I have read it at length, and it's definitely full of good stuff to know, but it really depends on your field. It's still
I have, not worth it (Score:5, Informative)
Don't get me wrong, Knuth is a genius. If you need to do deep research on sorting algorithms, definitely read it. If you want to do CS research and need to learn how to read research papers, it's a good start. But you aren't going to get any deep insights on how to write a good program from it. It's too academic and far too focused on deep research. And even for the topics it does cover, unless you want to do research on how to really optimize the hell out of them, you're better off using tutorials written at a more practical level.
Maybe (Score:5, Interesting)
I wasn't sure if I'd read 'em. I know a friend/colleague (who I regard highly) who has - and I think he thinks highly of them. But he also has terrible taste in movies.
A quick google search landed me at http://broiler.astrometry.net/... [astrometry.net]
I have not read it.
I've been coding professionally for 25-30 years, depending on how you count. I studied CS in college. I've read a few outstanding books on the subject since then.
I don't have the patience for these, and I suspect I'm not going to miss out on much.
On the other hand, I long ago came to the conclusion that I'm really not interested in low level code. Give me a nice high level language with nice high level functions and features and I'm a happy coder. That's not to say that I don't understand O notation or the costs behind the complexity - but it is to say that I know when to use a drill and when to use a power saw - but I don't want to build either of 'em.
Maybe you're into the nitty gritty. Or maybe you like bad movies.
Check your local tech library and see if you can check out a copy. Or ebay 'em for $20-40/volume. Or if the pdf strikes your fancy, maybe take the plunge.
Re: (Score:2)
For most development projects, I would agree that going into the nitty gritty can seem like overkill. But, having a core understanding of low level programming can make a huge difference in application performance. The third book "Sorting and Searching" should be required reading for anyone who plans on getting involved with databases, even if they only plan on being a dba.
You can do a lot with high level programming languages, but if you skip assembly or C programming for at least a background on what is h
Re: (Score:2)
No, you just have a crappy Internet connection. Thanks, kwerle!
I have read much of it, as I would an encyclopedia (Score:4, Interesting)
My wife and I each had a copy of the first three volumes when we married. Yes, there are female computer nerds. B-)
I first encountered it when assigned one of the volumes as a text back in 1971. Of course the class didn't consist of learning EVERYTHING in the volume. B-)
I use it from time to time - mainly as a reference book. Most recently this spring, when I needed a reference on a data structure (circular linked lists) for a paper. I've found it useful often when doing professional computer programming and hardware design (for instance, where the hardware has to support some software algorithm efficiently, or efficient algorithms in driver software allow hardware simplification).
I don't try to read it straight through. But when I need an algorithm for some job and it's not immediately obvious which is best, the first place I check is Knuth. He usually has a clear description of some darned good wheel that was already invented decades ago, analyzed to a fare-thee-well.
I only see him about once a year. He's still a sharp cookie.
Yes (Score:2)
I've read the 1973 editions, cover to cover. I skimmed a few fascicles. Haven't kept up since then.
Do I recommend it? You bet I do, just like I recommend Structure and Interpretation of Computer Programs, The Mythical Man-month, and The Psychology of Computer Programming. That's not to say you have to have read classics like these to be a good programmer, but if you haven't internalised a lot of the material that's covered in books like these, I question how much you care about what you do. And if you don't
Buy a set for the US Patent Office (Score:3, Insightful)
We should all chip in and get a complete set for the US Patent office. It might help them get rid of some bad patents they have issued over the years.
Or if you say you are using a computer in a patent app and don't cite Knuth as prior art for something you get tossed for that as well.
Not Read (Score:2)
While I haven't read the books, and doubt I ever will now, the content is similar to the uni course I did in computer science back in the 80s. Knuth covers everything in greater detail than I can recall being taught, but I'm pretty certain my foundations are just fine. It may just be that I'm forgetting some things too; it's been almost 30 years now for most of it.
I might go back and revise a topic or two in those books or a similar source if I felt I needed a refresher. For most cases though, what I can rec
Yes.... (Score:2)
Folks, we live in an age where programmers declare integers that are going to count from 1...10 as LONG INTEGERS, eating 8 bytes of RAM, where only 1 byte is needed.
We live in an age of cloud computing, load balancers, containers, and distributed databases with stored procedures. When code runs, you have no idea where it is running and how it is spread out over cloud services. Most of the time you don't even know what country the physical box is in.
I have a pure CS degree, but as long as we can keep makin
Re: (Score:3)
Yes but. I used to declare variables overly large as a kludge to help out when error-trapping was consuming too much time and I knew that the compiler wasn't good with overflows. So I'd do input error checking up to the point where it started to take too much time, then declare a variable larger than reasonable input would be, and then attempt to trap and reject input at a length between reasonable input values and the declared variable size. Declaring a variable just larger than the input buffer was one
TAOCP (Score:2)
Reference Material... (Score:2)
The modern version is much more of a reference than a textbook. While exercises are still part of the book, they're really for testing knowledge. Also, the field has just exploded, and the task is really daunting. Satisfiability is an example: the current fascicle is 320 pages, but a more in-depth look at the problem could run to twice that and more. And there is new work being done on the topic every day.
Frankly, fewer and fewer programmers will need it, as it is easier and easier to create and use algorithmic lib
Cover to cover? (Score:2)
I don't think I've read any college text book cover to cover. I open up TAOCP when I need to get into the theory on a particular topic.
The value of Knuth's volumes... (Score:2)
Re: (Score:2)
The interesting part is he doubled the reward for every error found, and surprisingly he hasn't had to write cheques for much money the last time I checked.
Yep, and got pencil-signed copies, too (Score:2)
Back when I was at AOL in Vienna, VA, there was a bookstore called Computer Literacy Bookstore, a few doors down from the headquarters of Ringling Brothers & Barnum & Bailey Circus (who annually would show off their elephant-de-jour).
I bought the first two editions there the moment I became aware of them. They're signed in pencil by Knuth himself. The fact that he used pencil I found amusing.
I bought the third edition, which was a huge, huge event as it was much anticipated, and enjoyed it better
No. (Score:2)
I did however read the Notepad help file.
Not much application (Score:2)
If you're working for Oracle and coding the Oracle database and are looking for an algorithm to squeeze a bit more performance out of the engine, go ahead and buy the book, you might find something in there. But most programmers are using sets and dictionaries in their chosen programming language that has a decent implementation of algorithms and won't be helped by some algorithm which might squeeze a few more cycles out of the computer but nobody will reward you for. The Knuth book is for high fliers and
Yes, most of the 3 volumes (Score:4, Informative)
I started reading them around 2001 and went through the three books, a little bit at a time. Went through most of the exercises with 30+ difficulty, but couldn't really solve all of them.
A lot changed for me. Back then, I was a newbie undergrad programmer with undergrad-level math skills. Fast forward 15 years: I went through grad school and then a couple of years of industry experience. My main programming languages moved from C++/Java to VHDL, then on to SystemC and SystemVerilog, and back to C++ with a bunch of bash scripts.
So, did I get to use the knowledge that I gained from reading it? Not much. I didn't even have to write a single data structure or algorithm, because there are perfectly good (or at least good enough) libraries for most of the issues I had to deal with. Neither did I get much use out of the math courses I took (remember things like the Laplace transform or LU decomposition?), nor did most of the non-engineering courses help much. Still, all of them helped shape how I understand the world and helped me gain problem-solving skills.
Would I recommend it to other people? It depends: if you find your data structure and algorithm textbook easy enough and you want more challenging stuff, TAOCP is a perfectly good motivator to train yourself to solve complex problems. That said, there are other ways to train complex problem-solving, e.g., a lot of advanced math/physics textbooks. For people who tend to fall asleep once they see those weird characters (and would rather live with pseudo-assembly code), TAOCP is a much better solution.
If you want to learn practical programming skills, then don't bother reading.
Yes, I have. (Score:3)
Why not a survey? (Score:2)
As surveys go, it would be as good as most of the recent ones.
Anyway, I've never read even one volume of the series, though I'm pretty sure I consulted it at various times. It was certainly available in the university libraries where I was teaching or studying. Also I remember seeing it in the research library when I was supporting the researchers. However, I can't really remember any details after all these years. The place I should have been introduced to it was when I was earning my CS degree, but I don'
Re: (Score:2)
Just remembered another one. I think Knuth also collaborated on a textbook called Concrete Mathematics, which I purchased but never finished reading (so it isn't in my records). Pretty sure I gave it to one of my professors when I finished my last stint as a student...
Parts of it (Score:3)
It's not like a novel you read front to back. If you need an algorithm you look it up.
I had to write a math library for a DSP that didn't have compiler support yet. TAOCP came in handy then. The parts I did read I went over again and again and once again. It wasn't fun by any means.
Re: (Score:3)
Re: (Score:2)
Those dumb names, like the silly "MongoDB," were just created to encourage the hyping of their technologies, even though they are simplified and inadequate versions of what was earlier well-tested and battle-proven.
Re: (Score:3)
You think anything written by anybody 77 years old is relevant?
So I take it you think K&R is irrelevant as well.
Re: (Score:2)
I get laughed at when I suggest memcached because all the cool young programmers "know" that redis is where it's at.
I had to look up this "redis" thing. I saw on the front page that it supports geospatial data. Then I looked up what constitutes "geospatial data" as far as redis is concerned. Then I cried a little.
It might be a fine product at what it was designed for, but any time I see people throwing around big words that they don't understand, it's a safe assumption that what they're trying to sell me is a toy.
Re:Hell no (Score:4, Insightful)
Re:Hell no (Score:4, Insightful)
It isn't terribly complex now that geniuses like Knuth have spent literally decades simplifying it for you, sure. Step deeper into the world and you'll be truly amazed at how deep it is ... and likely staggered that it works as well as it does.
+1. Programming isn't terribly complex if you always do it with your training wheels on, and if you never write anything that hasn't been written a hundred times before.
Re: (Score:2)
Hmm. Have you taken Numerical Analysis?
It's quite clear that he (she, it?) has never studied computer science at all.
Re: (Score:2)
I have sometimes compared those who have studied computer science (as opposed to learning how to program) with those who have studied music. You can be a very successful programmer without any computer science just as you can be a very successful musician without music theory. Mastery of the advanced studies of your discipline will make you a better than mere
Re: (Score:2)
“Computer Science is no more about computers than astronomy is about telescopes.” - commonly attributed to Edsger Dijkstra, but disputed.
[...] Mastery of the advanced studies of your discipline will make you better than merely someone who can just get the job done.
The caveat being here that a good portion of what goes for CS at universities is essentially "How to use the vast resources of a supercluster as a glorified pocket calculator". I had the dubious honour of suffering through four semesters of so-called CS at my uni, and I can attest that you can be an incredible computer scientist and still be unable to program even modestly simple applications. And that is said without even touching upon the vast difference between CS and software engineering (and the equall
simple concept can be as complex as anything (Score:4, Insightful)
OP is being a bit flippant.
Conceptually, the idea of using alphanumeric characters to give computers instructions is "simple," and getting a computer to do basic operations is fairly simple with a good tutorial or guide.
The idea that the codebase for a web app like Yelp's website or a phone app like Snapchat is "simple" or "easy to learn" is of course patently ridiculous...I think it boils down to whether or not you give OP the benefit of the assumption.
Seriously, OP really didn't say much other than, "No, it is easy."
Re:Hell no (Score:5, Funny)
Nah, I've been programming longer than Knuth has, starting with machine language. You just need to think procedurally.
In your case, it sounds more like "sporadically".
Re: (Score:2)
Re:Hell no (Score:5, Funny)
110010001000 caught at work [youtube.com]
Re: (Score:2)
Wish I had mod points.
Re:Hell no (Score:4, Interesting)
I've been programming longer than Knuth has, starting with machine language.
If it's not a rude question, how old are you, exactly?
Re:Hell no (Score:4, Insightful)
Re: (Score:2)
I can program in 6502, Z80, x86 and 680x0 assembly languages. Knuth doesn't deserve any thanks.
Me too. Rodney Zaks gets my thanks.
I use TAOCP as a handy authoritative reference for algorithms in specs. It's not a great read though.
Re: (Score:2)
Re: (Score:2)
But can you program in Z80 and 6502 machine code? You really only need to know a few instructions, because the details are in the bits.
I wrote my own assembler and debugger for programming my Apple //e. I got fed up with typing in hex.
https://github.com/dj-on-githu... [github.com]
If you want to write a book, just do it (Score:3)
Sure, yeah, you could take a few weekend courses and bang out some stuff and possibly even find a job paying decent money.
But if you want to move up in the world you need to turn your hack and slash techniques into a refined art.
The kind of crap commodity programmers write is the stuff that skilled developers get paid a lot of money cleaning up or just re-implementing.
It's the difference between dime-store trashy romance novels and real, actual novels. The difference between the Divergent movies and Hunger Games.
Re: (Score:2)
Re: (Score:2)
The difference between the Divergent movies and Hunger Games.
Soo....it's basically the same?
Re: (Score:2)
The difference between the Divergent movies and Hunger Games.
And which of those isn't the dime store trashy novel?
Re: (Score:2)
Sure, yeah, you could take a few weekend courses and bang out some stuff and possibly even find a job paying decent money. But if you want to move up in the world you need to turn your hack and slash techniques into a refined art. The kind of crap commodity programmers write is the stuff that skilled developers get paid a lot of money cleaning up or just re-implementing. (...) If you want to work in the big leagues on important things, you need to be open to learning some things and respect the craft.
With all possible respect to all the CS experts of the world, that's not what they teach. Finding a good organization of your application that makes structures easy to break down, processes easy to follow and changes easy to implement doesn't involve deep, abstract mathematical formulations with optimal answers. It's about creating functional units (objects, layers, modules, services) with clear responsibilities that abstract away internal details, create well defined and narrow interactions, break up and e
Re: (Score:2)
With all possible respect to all the CS experts of the world, that's not what they teach. Finding a good organization of your application that makes structures easy to break down, processes easy to follow and changes easy to implement doesn't involve deep, abstract mathematical formulations with optimal answers. It's about creating functional units (objects, layers, modules, services) with clear responsibilities that abstract away internal details,
I'm not sure about your university, but my professors tried to teach that in addition to the math stuff.
Re: (Score:3)
Re:Hell no (Score:5, Insightful)
It's not complex if you merely want it to run, but if you want flexible, maintainable, and readable code, then it is complex.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Is that you Nike [nike.com]/Shia [youtube.com]? :P
Just do whatever, and expose it to the internet (Score:2)
> Programming isn't terribly complex. If you want to program, just do it.
You really, really should know better than that by now. In the 1980s, if you wanted to write a really crappy macro and use it on your computer, fine. Today, most software is exposed on the internet and runs on devices connected to networks that people depend on, networks that contain private information of one kind or another. "Don't worry about knowing what you're doing, just do whatever" is an extremely foolish approach.
Re: (Score:2)
Programming isn't terribly complex.
Awesome that you think so! Now, program some realtime flight surface control software for a fly-by-wire jet and sleep well knowing that your program will never, ever, kill anyone... (Or, substitute any other safety-critical software you can think of - and there's a lot!)
"Programming" (by which I really mean software engineering) is one of the most complex activities in existence...
Re: (Score:2)
Programming isn't terribly complex.
Awesome that you think so! Now, program some realtime flight surface control software for a fly-by-wire jet and sleep well knowing that your program will never, ever, kill anyone... (Or, substitute any other safety-critical software you can think of - and there's a lot!)
"Programming" (by which I really mean software engineering) is one of the most complex activities in existence...
Just because it can be doesn't mean that it always is. The avionics-software programmer is working very differently to the muh-first-website programmer. It can be complex like avionics, but it can also be simple like javascript text-adventures.
"Programming" can be, at times, one of the most complex activities in existence. It can also be, at other times, one of the simplest activities you can find paying high salaries. Let's not pretend that programming is always more complex than brain surgery. It can (on r
Re: (Score:2)
Driving a car isn't hard; millions do it.
I waited until I was 37 years old to learn how to drive. My father wasn't going to teach me as a teenager to drive stick on his one-ton flatbed, which he put a million miles on in ten years. Since Silicon Valley has a well-developed transit system, I got around just fine without needing a vehicle. One day my father abandoned his old car in my carport. I had no choice but to get my driver's license and take possession of the car. Took me three years to find out about all the repairs that he didn't
Re: (Score:2)
NASCAR is easier than, say, driving in Amsterdam. In Amsterdam you find bikes and pedestrians on your path, and, Oh! The horror! Right turns!
Re: (Score:2)
>rich base classes
are complex.
Machine code is simple. There's much less to know,
Re: (Score:3)
Re: (Score:2)
Please rest assured that absolutely no-one wishes they were you.
Re: (Score:2)
Re: (Score:3)
CPU and memory are still very expensive, especially in the mobile market, and efficient programming and memory management are still very relevant, particularly where memory is scarce. And there are still plenty of people using microprocessors that run at no more than a few MHz with several kilobytes of memory.
Even the Arduino libraries themselves are rife with examples of such 'bad' programming; some operations take many more cycles than necessary while using a simple example i
Re: (Score:2)
OP said other books have covered these needs better, in OP's opinion.
You do make a good point however, there will always be people cramming circuits into smaller and smaller things and some code has to run them.
Re: (Score:2)
Great analysis, thanks.
Re: (Score:2)
Re: (Score:2)
Better yet, just steal it for a while; the torrent with all four volumes is out there. Look through it, and if it's worthwhile, buy it.
Yarrrrrrr, yo ho, yo ho, a pirate's life for me....
Re: (Score:2)
interesting comment
Re: (Score:3)
My check from Knuth, for finding an arithmetic typo in Vol. 2, is for the amount of $5.16, which includes accrued interest.
Regarding the original question: these books are fantastic, but they are challenging, and their bit-miserly focus does not map well to today's typical programming needs. They are frankly too detailed and difficult for many readers. My advice is to have a look at them and decide whether you need them to complete your life. If so, buy the set and enjoy! If not, you probably won't miss them.
Ju