Ask Slashdot: Modern Web Development Applied Science Associates Degree?
First time accepted submitter campingman777 writes "I am being asked by students to develop an Associate of Applied Science in modern web development at my community college. I proposed the curriculum to some other web forums and they were absolutely against it. Their argument was that students would not learn enough higher math, algorithms, and data structures to remain viable employees when their industry changes every five years. Since part of our mission is to turn out graduates immediately ready for the workforce, is teaching knowledge-based careers as a vocation appropriate?"
Teach the fundamentals (Score:5, Insightful)
The fundamentals never change. With a solid base, there is nothing a programmer can't do.
An AA program focused on what will get them hired today is exactly what will not get them hired tomorrow.
Re:Not a good idea (Score:4, Insightful)
2) Like any other degree, the point is to get the piece of paper. You're hoping the degree shows that people are smart enough to learn a new language and to understand how the language of their particular platform works in general. Web development is generally far less grounded in hard math and logic than most other forms of development. You don't train a nurse to perform open-heart surgery as though they were a cardiac surgeon, and likewise you don't need to train a JavaScript developer to write assembly or know advanced calculus.
Product of community college reporting in (Score:4, Insightful)
Sure (Score:5, Insightful)
This assumes 'web development' refers to web-based applications, not just informational webpages.
This is likely to be an unpopular opinion to many, but I don't see the huge barrier here.
I've been working as a software developer for nearly 20 years now, going from games programming to business apps to web development and machine learning. In all that time, I can count on one hand the occasions when I've needed mathematical skills more advanced than trivial algebra. Sure, in college they made me write my own compilers, I had to write my own vector math routines for my ray tracer, and so on, and I consider those valuable learning experiences. However, in the real world, where I'm employed and make money, I use software libraries for those sorts of things.
When it comes to data structures, the languages employers hire for today, Java and C#, provide me with the majority of the structures I need, plus optimized-enough algorithms to manipulate them. I don't have to do a big-O analysis to decide whether my data patterns would be better served by, say, a skip list instead of a sorted array, because we just throw memory and CPU at the problem anyway!
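A minimal sketch of that point in JavaScript (the data and variable names here are made up for illustration): instead of hand-rolling a sort or a hash table, everyday application code leans on the structures and algorithms the platform already ships.

```javascript
// Hypothetical data for a typical line-of-business task.
const users = [
  { name: "dana", visits: 42 },
  { name: "ari", visits: 7 },
  { name: "lee", visits: 19 },
];

// Built-in sort: the engine's implementation is already O(n log n);
// no need to choose or write the algorithm yourself.
const byVisits = [...users].sort((a, b) => b.visits - a.visits);

// Built-in Map: fast keyed lookups without writing a hash table.
const byName = new Map(users.map((u) => [u.name, u]));

console.log(byVisits[0].name);         // "dana"
console.log(byName.get("ari").visits); // 7
```

The comparator and the `Map` constructor are the only "algorithmic" decisions the developer makes; everything else is library code.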
The point is, if you spend 1-2 years learning to write software - not computer science theory - you'll be ready to enter the workforce. Sure, you won't be the one creating those frameworks, and you won't be an architect, but you'll be able to use them. After a few years of real-world problems with Google at your fingertips, you'll likely have learned enough to start tackling those harder problems.
Here's what I'd prioritize ahead of computer science theory, with regard to employment:
- Proficient in SQL and at least one database type
- Familiar with IDEs, source control, bug/task trackers, automated builds and testing, debugging tools and techniques.
- Ability to work in a group software project.
- Exposure to and participation in a full-blown software development life cycle (SDLC): reading, writing, and evaluating requirements; coding; debugging; QA; unit testing; and the oft-overlooked documentation. Include at least one waterfall project and one agile-ish project.
- Expertise in HTML, CSS, and JavaScript, plus awareness of JavaScript libraries and frameworks.
I don't think I need to explain the value of any of these, and for the entry-level developer these practical concerns trump high-level concepts like discrete mathematics or heuristic design.
Change? In the web? Not really. (Score:5, Insightful)
JavaScript and HTML haven't changed all that much. CSS? Even there, the pace of change is slowing. Web architectures have been stable for years.
Nobody in real life uses higher math in front-end web development. They might use multiplication and division to do layouts. It's debatable whether anyone actually designs algorithms. Data structures would be handy, but it's also arguable whether web developers actually understand them - especially if you ask any DBA how website A uses the RDBMS.
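For what it's worth, a sketch of the kind of "layout math" meant here (`columnWidth` is a hypothetical helper, not from any library): dividing a container into equal columns with fixed gutters is about as mathematical as day-to-day front-end work gets.

```javascript
// Width of each column when a container is split into `columns`
// equal columns separated by fixed gutters, all in pixels.
function columnWidth(containerPx, columns, gutterPx) {
  return (containerPx - gutterPx * (columns - 1)) / columns;
}

console.log(columnWidth(960, 3, 30)); // 300
```

Multiplication, subtraction, and division - nothing a student needs calculus for.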
Web frameworks would be handy. There are general things about frameworks that don't change.
What would be good would be some discussion around the process of building a website, from customer requirements to deployment. How to choose a technology, payment processor, server technology, etc.
Re:Not a good idea (Score:5, Insightful)
i agree that anything that's relevant now will not be 2 years from now
-sarcasm on-
Yeah, remember back in the '90s when HTML, JavaScript, Java, etc. were important for web developers? All long forgotten now.
Not to mention all those OOP languages that were gone within 2 years of being introduced, like C++.
-sarcasm off-
Do you people ever actually read what you type?
Changes every 5 years? (Score:4, Insightful)
When I was in college, one of my computer science professors told us that everything he was teaching us would be obsolete by the time we graduated, but that the concepts behind it would be valuable for our entire careers. Sure enough, I've never used the exact code in the exact language he taught, but the underlying concepts carry over to almost any language I program in.