Ask Slashdot: Becoming a Programmer At 40? 314
New submitter fjsalcedo writes "I've read many times, here at Slashdot and elsewhere, that programming, especially learning how to program professionally, is a matter for young people. That programmers after 35 or so begin to decline and even lose their jobs, or at least part of their wages. Well, my story is quite the contrary. I've never made it after undergraduate level in Computer Science because I had to begin working. I've always worked 24x4 in IT environments, but all that stopped abruptly one and a half years ago when I was diagnosed with a form of epilepsy and my neurologist forbade me from working shifts and, above all, nights. Fortunately enough, my company didn't fire me; instead they gave me the opportunity to learn and work as a web programmer. Since then, in less than a year, I've had to learn Java, JavaScript, JSTL, EL, JSP, regular expressions, Spring, Hibernate, SQL, etc. And, you know what? I did. I'm not an expert, of course, but I'm really interested in continuing to learn. Is my new-born career a dead end, or do I have a chance of becoming good at programming?"
I agree (Score:3, Interesting)
Go for it. The only one that should be telling you what you can or can't do is yourself.
If you have a passion for something you will enjoy it and may become very good at it.
Full Steam Forward (Score:5, Interesting)
Re:Good for you! (Score:0, Interesting)
Nailed it.
In America, the real way is to start your own business. The older you get, the more important that is.
When you're 40 and thinking about a new career track, you have already fallen off the ladder, and in the HR imbecile's mind you are fatally damaged goods.
If you want to make it, you have to take what crap work you can, while bootstrapping up your OWN business.
Risky? You'd better believe it. But given the odds, it's about the only thing left.
Since most of the people around here look at wage slavery as their holy grail, you had best Mod this -1.
Re:Go for it (Score:5, Interesting)
Re:Good for you! (Score:5, Interesting)
One of the best programmers I've ever worked with started as an accountant and became a programmer in his 40s, first with ASP and then with PHP. What he lacked in advanced knowledge he made up for in spades by being careful and methodical. He never tried to show off, and when he designed something it was generally right the first time; out of the 20 programmers in our office he had by far the lowest bug count.
Re:Good for you! (Score:5, Interesting)
The one thing I have over them, though, is experience, caution, and patience. I have the ability to do something right the first time, even though it takes me longer. They are faster, but it takes them more tries to get it right, and many times my one try is much faster than their 10 tries. You've got to use what you have to your advantage. If my boss needs something done quick-and-dirty style, he asks one of the younger people. If it needs to be perfect, he asks me. We all have a place here, and by combining all of our strengths together as a team we kick some serious ass.
Re:it's at a dead end (Score:4, Interesting)
They have been saying this since the 60s, yet people still seem to be writing code. What seems to happen is, by the time a computer catches up with a major development pattern, developers are already off to the next pattern of development.
I mean, an operating system basically does what we would have called programming 40 years ago: writing instructions to the processor, calculations, etc. The nature of programming has changed since then, as it will over the next 40 years. I could see there being an application that models relevant data, builds interfaces, and maybe even makes them look nice. But I doubt that will be the way we interact with computers by the time they can do it.
http://www.amazon.com/What-Computers-Still-Cant-Artificial/dp/0262540673 [amazon.com]
This book is one of the first, best discussions about the major challenges that AIs face. The articles about ambiguity tolerance really tell you all you need to know to understand this point. While AIs are pretty awesome at this point, they really do rely on clustering algorithms and normative pattern analysis to construct the facts they operate on. It's useful as a means of understanding the world, but it's not really the same as what most people would call 'judgement' and it's certainly not the way people work in the world.
I have a theory about why AI will never replace coders. Once a machine gets to the point where it can handle the tasks of a coder, those tasks become commonplace. People strive for more; technology is necessarily an innovation market. Eventually something new comes along, and it takes decades to come to grips with it. During that time, people are the ones working out what's useful and interesting.
In other words, it's all a cycle, and machines are constantly catching up by automating what we did before. They never lead, which is why we have coders.