Ask Slashdot: Best Rapid Development Language To Learn Today? 466
An anonymous reader writes "Many years ago, I was a coder—but I went through my computer science major when they were being taught in Lisp and C. These days I work in other areas, but often need to code up quick data processing solutions or interstitial applications. Doing this in C now feels archaic and overly difficult and text-based. Most of the time I now end up doing things in either Unix shell scripting (bash and grep/sed/awk/bc/etc.) or PHP. But these are showing significant age as well. I'm no longer the young hotshot that I once was—I don't think that I could pick up an entire language in a couple of hours with just a cursory reference work—yet I see lots of languages out there now that are much more popular and claim to offer various and sundry benefits. I'm not looking to start a new career as a programmer—I already have a career—but I'd like to update my applied coding skills to take advantage of the best that software development now has to offer. (More, below.)
Ideally, I'd like to learn a language that has web relevance, mobile relevance, GUI desktop applications relevance, and also that can be integrated into command-line workflows for data processing—a language that is interpreted rather than compiled, or at least that enables rapid, quick-and-dirty development, since I'm not developing codebases for clients or for the general software marketplace, but rather as one-off tools to solve a wide variety of problems, from processing large CSV dumps from databases in various ways to creating mobile applications to support field workers in one-off projects (i.e. not long-term applications that will be used for operations indefinitely, but quick solutions to a particular one-time field data collection need). I'm tired of doing these things in bash or as web apps using PHP and responsive CSS, because I know they can be done better using more current best-of-breed technologies. Unfortunately, I'm also severely strapped for time—I'm not officially a coder or anything near it; I just need to code to get my real stuff done and can't afford to spend much time researching/studying multiple alternatives. I need the time that I invest in this learning to count.
Others have recommended Python, Lua, Javascript+Node, and Ruby, but I thought I'd ask the Slashdot crowd: If you had to recommend just one language for rapid tool development (not for the development of software products as such—a language/platform to produce means, not ends) with the best balance of convenience, performance, and platform coverage (Windows, Mac, Unix, Web, Mobile, etc.) what would you recommend, and why?
Python (Score:1, Informative)
Really? No. If you want a job, learn Javascript. It's used frontend everywhere, now backend with node, and the pay is good.
Python + Qt (Score:5, Informative)
With Qt you can develop for desktop or mobile, with a GUI or not. With Python you can do simple scripting all the way up to full-blown apps. Once you become familiar with Qt you can also fall back to C++ if you need the performance. You also have the option of using Qt's GUI as traditional widgets or JavaScript-based Qt Quick.
Re:Python (Score:5, Informative)
There are two possible answers to the question: Python and Javascript.
Python is a general-purpose language, with a large number of user areas. It is your best bet for general applicability.
However, if you want to aim for the web market -- which, granted, is huge -- go with Javascript.
That's pretty much all you need to know to make your decision.
I agree Python (Score:5, Informative)
My vote is for Python. My reasons are that it's very good for the rapid part. There's also tons of libraries to do darn near everything under the sun (see pypi.python.org). Finally, one thing in their mantra is that readability counts. This means that you can pick up your project several months later and know what it does... maybe even someone else's! Try doing this with Perl or Ruby, and it's much harder.
Python works quite well on the UNIX like systems, decently on Windows, has good command line helper libraries (argparse or optparse), and has several really good web frameworks. Heck, you can use IronPython or Jython and mix into your .NET or Java code!
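As a sketch of that command-line side, the standard-library argparse module gives a one-off tool flags, defaults, and a free `--help` screen in a few lines (the file name and delimiter option here are hypothetical):

```python
import argparse

def build_parser():
    # Minimal CLI for a quick one-off tool: one input file, one optional flag.
    parser = argparse.ArgumentParser(description="Quick CSV processing tool")
    parser.add_argument("infile", help="path to the input CSV file")
    parser.add_argument("-d", "--delimiter", default=",",
                        help="field delimiter (default: comma)")
    return parser

# Parsing a sample argument list; a real script would call parse_args()
# with no arguments to read sys.argv.
args = build_parser().parse_args(["dump.csv", "-d", ";"])
print(args.infile, args.delimiter)
```

Running the script with `-h` prints the generated usage text, which is often all the documentation a throwaway tool needs.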
The biggest weak point is probably full GUIs. It's not that there aren't any good ones, there's just not a good default one. TkInter is built-in, but it's based on Tcl/Tk, the interface isn't very Pythonic, and the end result isn't great. WxPython is good for basic GUIs, but adding custom widgets is hard. PyQt and PySide have a more complete collection of widgets, but again it is tough to add new widgets. PyGTK has a large collection of widgets, and widgets can be written in Python and become first-class widgets even in other languages. The new kid on the block is Kivy, which is kind of like QML for Python. Kivy defines very low level functionality that builds up widgets, but it makes it easy to combine them together to make a complete widget. This sounds like a lot of work, but it turns out to not be as bad as you'd expect.
Also, PyDev, PyCharm, and WingIDE are all pretty amazing IDEs for Python.
Finally, there's a good amount of jobs asking for Python, especially in big cities.
Scala (Score:5, Informative)
Re:What's wrong with html and javascript? (Score:3, Informative)
Are you insane?
There is no way your shot-gun approach to debugging is ever faster or better than dropping JS into a debugger; having something like intellij hook into your browser and debug your code running in an IDE is by miles better and faster than what you suggest.
Especially if you are doing something difficult and not just trying to load an ajax request...
Global Interpreter Lock (Score:4, Informative)
Re:I agree Python (Score:2, Informative)
Like the OP, I also came up through university when everyone was learning Lisp (mostly Scheme) and C. Early in my career, I did a lot of distributed systems programming in C, and far too much scripting in SH for both multi-platform build systems and actual runtime systems (monitoring, data reformatting, etc.). I made a transition from developer to architecture/management roles and then eventually found my way back to a kind of mentoring and special projects role where I do a mix of rapid prototyping for new products and also one-off stuff where the outputs matter more than the resulting code. I reevaluated a lot of my background, skills, and biases throughout this period and have become pretty mercenary in how I choose tools.
I can also endorse Python for the OP's needs. You don't need to, and shouldn't, use it where a bit of BASH would suffice. You also need to realize Python diverges from Lisp in quite a few regards, so be willing to learn and adapt to its quirks. But it serves a nice purpose for that step where you wish you could apply an obvious algorithm or data structuring method that is beyond a line-record data stream, yet you don't have an existing C framework nor the budget/priorities to develop one yourself. In Python, you can quickly import a few existing modules and evaluate their fit for your problems, getting the value of scripted/dynamic prototyping that is hard to accomplish using C. An IDE or even just the basic read-eval-print loop is very useful here. Usually, the available Python bindings for different packages are well-designed enough to easily mix-and-match in ways that would require vast amounts of glue code or data restructuring in raw C.
I've gotten a lot of mileage out of Python for cleaning and pre-processing CSV and JSON datasets, using the obviously named "csv" and "json" modules. I've more recently been using numpy and scipy modules to do numerical work and image processing experiments, where we needed some preliminary results to decide if and how to pursue more funding and launch a new project at work. In a few cases, I've shifted from CPython to pypy runtimes to execute a particular CSV or JSON data-processing step at scale, also mixing in bits of BASH to run parallel work on different data files, etc.
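A minimal sketch of that kind of cleanup with the standard csv and json modules (the column names and the cleanup rules are made up for illustration; real dumps would be read from files rather than a string):

```python
import csv
import io
import json

# Hypothetical raw CSV dump with stray whitespace and a missing value.
raw = "id,name,amount\n1, Alice ,10.5\n2,Bob,\n3, Carol ,7\n"

cleaned = []
for row in csv.DictReader(io.StringIO(raw)):
    # Skip rows with a missing amount; strip whitespace from names.
    if not row["amount"].strip():
        continue
    cleaned.append({"id": int(row["id"]),
                    "name": row["name"].strip(),
                    "amount": float(row["amount"])})

# Hand the cleaned records downstream as JSON.
print(json.dumps(cleaned, indent=2))
```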
However, if you are doing very much manipulation of tabular data, I'd recommend learning a bit of SQL too. It is very easy to use Python for pre-processing to load data into a temporary SQLite or PostgreSQL database and then use SQL directly to do the heavy lifting of data analysis. If you have to do statistical work on large datasets like this, also take a look at MADlib, which is a very interesting integration of PostgreSQL, Python, and C libraries to do a range of standard methods on large tables within the database. For example, principal components analysis (PCA), k-means clustering, and other data-mining/statistical methods become simple SQL queries using stored procedures defined by the MADlib package. I've used these on tables with 100M to 1B rows to good effect.
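The SQLite route needs nothing beyond the standard library; a rough sketch, with an invented table and sample rows, of loading pre-processed records into an in-memory database and letting SQL do the aggregation:

```python
import sqlite3

# Hypothetical pre-processed rows: (product, quantity, unit price).
rows = [("widgets", 3, 9.99), ("gadgets", 5, 14.50), ("widgets", 2, 9.99)]

conn = sqlite3.connect(":memory:")  # or a temp file for larger datasets
conn.execute("CREATE TABLE orders (product TEXT, qty INTEGER, price REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

# Let SQL do the heavy lifting: total revenue per product.
totals = dict(conn.execute(
    "SELECT product, SUM(qty * price) FROM orders GROUP BY product"))
print(totals)
```

Swapping the connect call for a PostgreSQL driver keeps the same shape while scaling to the 100M+ row tables mentioned above.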
These things all give you the flexibility to compose new data pipelines, in the spirit of BASH/sed/grep/awk, but with more complex data formats and more intensive processing at each stage. You don't have to dump all your systems/architecture thought processes if you view them as a larger set of tools to integrate into a pipe. You just replace the basic Unix line-record stream with structured data files and/or database connections. However, you need to develop the ability to recognize and steer clear of the framework/stovepipe cults that seem to exist with all the newer programming languages. There seem to be a lot of people who want to build "pure" environments that disavow all methods from outside and suffer serious second-system effects, and this creates a lot of false leads when you are searching the web for tools and techniques.
Re:I agree Python (Score:4, Informative)
I've gotten a lot of mileage out of Python for cleaning and pre-processing CSV and JSON datasets, using the obviously named "csv" and "json" modules. ... However, if you are doing very much manipulation of tabular data, I'd recommend learning a bit of SQL too.
You may want to look into pandas [pydata.org] as a middle ground. It's great for sucking in tabular or csv data and then applying statistical analysis tools to it. It has a native "dataframe" object which is similar to database tables, and has efficient merge, join, and groupby semantics. If you have a ton of data then a database and SQL is the right answer, but for a decent range of use cases in between pandas is extremely powerful and effective.
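A small sketch of that merge-then-groupby pattern, assuming pandas is installed and using two invented tables:

```python
import pandas as pd

# Two hypothetical tables: an orders log and a product lookup.
orders = pd.DataFrame({"product_id": [1, 2, 1], "qty": [3, 5, 2]})
products = pd.DataFrame({"product_id": [1, 2], "name": ["widget", "gadget"]})

# Join on the shared key, then total quantity per product name --
# the dataframe equivalent of a SQL JOIN plus GROUP BY.
merged = orders.merge(products, on="product_id")
totals = merged.groupby("name")["qty"].sum()
print(totals)
```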
Re:Seriously? (Score:4, Informative)
It takes special skills to program? Maybe if you are doing some rather complex operations, but in the same regard I wouldn't want to re-gear the transmission or rebuild the engine of a car while I'm perfectly capable of customizing other aspects of a vehicle. Programming is the same way: someone can be capable of doing something they want to do (run a website and manage the database, or script their everyday crap into a few lines of code) without being 'an uber hax0r' who understands OS theory at the assembly level and is capable of dealing with the full range of network security threats.
Mythologizing programming is what leads to the nephew who knows a little html being assigned as the head of IT; after all that little html takes all that programming knowledge!
And since your opinion of other programmers is so low:
might I suggest that the D-K effect is in full show and, on behalf of all coders, hackers, code monkeys, keyboard jockeys, and everyone who's ever touched a computer, may I ask, beg, and plead that you please never write another line of code again.