What Makes a Programming Language Successful?
danielstoner writes "The article '13 reasons why Ruby, Python and the gang will push Java to die... of old age' makes an interesting analysis of the programming languages battling for a place in programmers' minds. What really makes a language popular? What really makes a language 'good'? What is success for a programming language? Can we say COBOL is a successful language? What about Ruby, Python, etc?"
Re:I don't really get the Java hate around here (Score:1, Informative)
Yes, of course applications (which are not Java itself) may act strangely or crash entirely when jumping a major version of Java*, but what are you expecting? Win 3.1 apps can act a bit strangely under XP, XP apps may show some issues with Vista, gcc 3.2-made binaries can get all urpy over gcc 4.0-made libraries, and OS 9 apps need an entire emulation layer to run under OS X. Upgrades can break things. That is not news.
*: Well, okay, that's a bit of confusion from Sun's end. To most people, "Java 1.5" to "Java 1.6" implies a minor upgrade, when in fact it involved a considerable amount of change.
Re:I don't really get the Java hate around here (Score:3, Informative)
My first experiences with Java were with the stuff that ran like crap as the so-called end-all-be-all, write-once-run-anywhere GUI language. That is not true today. Java is now a middleware language. It's become glue, and more behind the scenes than it was back in the day.
So, what makes a programming language successful? Well, of course, its success!
No, seriously: today, a programming language becomes and _stays_ successful if it meets these criteria. 1) Does it have a good user community, and is it still used for new projects and not just "legacy" ones? 2) Does it have extensibility and interoperability? That is a BIG one. CPAN, libraries, JARs, APIs: all of those additional features determine a successful programming language.
Today, the most successful programming languages are FORTRAN (it's a science and engineering thing, and it's not going away tomorrow), C/C++, JavaScript, Java, and Python.
Programming languages come and go. The way I see it, the real question is how we are going to get any or all of the above languages to take advantage of the trend toward distributed and SMP systems.
_NONE_ of the languages listed there does this particularly well, and there have been TONS of new languages designed to fix these problems, but to date we are left with threads, OpenMP, and MPI, plus some lesser-known languages like Erlang, Titanium, High Performance FORTRAN (or did they give up on that one?), and the like.
I see programming going through a needed paradigm shift "Real Soon" (TM) to address these issues, along with the development tools as well. Computers are bigger and more complex than they were yesterday, and the languages have not yet caught up to this complexity.
Maybe Ada will come back to life and fix all of this? I don't think so.
Re:Article is slashdotted, can someone post a copy (Score:1, Informative)
Re:Aging Engineers (Score:5, Informative)
Depends on your skills. C# is a safer environment, but C++ has immensely more expressive power. With modern and well-coded C++, these 1,000 lines may equal 10,000-20,000 lines of C#/Java. Unfortunately, the ugly C++ syntax and its C cruft make unlocking the true advantages of C++ a black art.
A trivial example is the STL. Java/C# containers don't even come close to the STL's power. Go further and look at Boost.MPL/Fusion/Proto, and you'll see stuff you simply cannot do with Java/C#.
Well, if it were up to me, Lisp would be king. But it's not a perfect world.
Re:I don't really get the Java hate around here (Score:2, Informative)
Oh No! (Score:2, Informative)
Quick, sell your Google stock!
There is a lot of solid Python code out there. The best way to write Python is to also write unit tests. Which is a good practice in any language.
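A minimal sketch of what that practice looks like with Python's standard unittest module (the function under test, parse_csv, is a hypothetical example, not from the article):

```python
import unittest

def parse_csv(line):
    """Parse a comma-separated line of integers into a list."""
    return [int(field.strip()) for field in line.strip().split(',')]

class ParseCsvTest(unittest.TestCase):
    def test_parses_ints(self):
        self.assertEqual(parse_csv(' 1, 2, 3 '), [1, 2, 3])

    def test_rejects_garbage(self):
        # A non-numeric field raises instead of silently
        # producing a bogus value.
        self.assertRaises(ValueError, parse_csv, '1, two, 3')

# run with: python -m unittest <this module>
```

The same pattern works in any language; the point is simply that dynamic typing shifts some checking from the compiler to the test suite.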
Web cache (Score:1, Informative)
Re:Perhaps a better measurement than /. popularity (Score:5, Informative)
C++ is NOT a "Microsoft language".
Grace Hopper & John Backus didn't have (Score:4, Informative)
Re:Perhaps a better measurement than /. popularity (Score:3, Informative)
The Graph (Score:3, Informative)
I found it interesting in this graph [littletutorials.com] in the article that, beginning around 2004, the popularity of Python closely follows the popularity of Delphi. Is this some kind of flaw in the methodology in how the data was collected, like too small a sample size (as Delphi has nothing to do with Python I think)? Or is there some overall pattern causing this? A few others are pretty close in some spots, too, in terms of first derivatives.
Note that the server isn't really /.ed at the time I am writing this, but is throttling itself based on limitations of the hosting account (says only 10 concurrent servings). If you keep trying (aka refreshing the browser) you will get the image I linked to above.
Re:I don't really get the Java hate around here (Score:5, Informative)
Are you serious? Where I work, we regularly use at least three Java applications, and each one requires a particular version of Java, none of which are the same. One of them requires Java 1.5, while another one will break completely if Java 1.5 is installed. It's a nightmare! And while yes, the version requirements may be the fault of the developers, the fact that it can happen at all is unacceptable.
-Em
Re:Off the top of my head? (Score:5, Informative)
Re:Off the top of my head? (Score:3, Informative)
I wouldn't write a large system with Python, but I use it a lot for quick (say up to 1 kLOC) programs because it has a clean, pseudocode-like syntax for expressing algorithms that is terse without being cryptic. It has enough error checking to help me avoid a lot of mistakes, but not so much that it slows down my hacking. It doesn't necessarily have the best regular expression syntax, or the best performance, etc. etc., but it suits me for the scripts I write.
Re:Wow, I am a little Amazed (Score:3, Informative)
As the author mentions, a new programming language is only useful if it addresses a much needed gap in other languages. Java addresses the gaps of memory management, complexity, and cross-platform. Java now has a huge install base, and is serving as a relatively stable language that can do whatever the imagination desires. It's maturing every version, giving it more credibility. It's pretty tough to compete with that.
Learning a new language simply to follow a trend is the sign of inexperience (not youth). Old or young doesn't play into it (although you feel it does). If a new programming language does the exact same thing as another, there is no point to migrate to it. Time and costs simply don't warrant it.
I can agree that learning several languages is an advantage. It helps focus your understanding of programming concepts by having to deal with them in different ways. But if you look at people who know multiple languages, the languages are probably going to be from different categories: a low-level language, a high-level language, a scripting language, a compiled language, a web-friendly language, etc.
Btw, the languages I know (and the order I learned them in) are: BASIC, Assembler, C, Unix Shell scripting, C++, Perl, PHP, Java, COBOL. (Yes, I just recently learned COBOL for a new job)
Re:Aging Engineers (Score:3, Informative)
I was under the impression that this is how all quality contemporary software development in the C language was conducted. Maybe the kids weren't as clever as they were up to date.
Re:I don't really get the Java hate around here (Score:5, Informative)
If so, your criticism is bizarre; the example is written to illustrate that "Python pre-2.1 did not have lexical scopes," not to illustrate the shortest way to rewrite the built-in sum function (you realize, right, that the idiomatic implementation of sum in Python is the built-in function?).
The reason map and reduce aren't cared about is that most people have an easier time with list comprehensions. Your code:
l = l.strip().split(',')
l = map(lambda x: int(x.strip()), l)
can be written as:
l = [int(x.strip()) for x in l.strip().split(',')]
in Python 2.4 onwards. Obviously, you could put that on as many lines as you wanted. If you are worried about performance, generator expressions are very similar to list comprehensions but lazily evaluated:
g = (int(x.strip()) for x in l)
g would then create items as they are called for by some consumer (for instance, a for loop or a container object).
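Putting the two together on a concrete input (a sketch; the sample string here is mine, not the parent's data):

```python
# Sample input: a comma-separated string of numbers.
line = ' 3, 1, 4, 1, 5 '

# List comprehension: builds the whole list eagerly.
nums = [int(x.strip()) for x in line.strip().split(',')]

# Generator expression: same logic, but items are produced
# lazily, one at a time, as a consumer asks for them.
gen = (int(x.strip()) for x in line.strip().split(','))

print(nums)      # [3, 1, 4, 1, 5]
print(sum(gen))  # 14 -- the generator is consumed here
```

For small inputs the difference is negligible; the generator form matters when the input is large enough that you don't want the whole list in memory at once.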
Re:I don't really get the Java hate around here (Score:4, Informative)
1) Those capabilities will still exist in the base language, just with different syntax.
2) If you want the old syntax, it will be available in a standard library.
Save your hysteria for something genuinely catastrophic, like the loss of the print statement.
Re:Off the top of my head? (Score:5, Informative)
Personally, I think of Python as a prettier and more coherent version of Perl.
Re:Back to Basic (Score:2, Informative)
Static and dynamic typing are orthogonal to strong and weak typing. Statically typed languages have their type integrity constraints checked at compile-time. Dynamically typed languages do at least some type checking at run-time, usually because resolving a given object's type is undecidable (until it is decided by actually running the code).
Python has a strong, dynamic typing system. So does Ruby. Except this is just an implementation detail, since these languages support the actor model of object orientation, which means that actors/duck typing is the appropriate model for the type system.
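A small illustration of that combination (the classes here are hypothetical, just for the sketch): any object with the right method will do (duck typing), but incompatible operations are never silently coerced (strong typing):

```python
class Duck:
    def speak(self):
        return 'quack'

class Robot:
    def speak(self):
        return 'beep'

def greet(thing):
    # Duck typing: no declared interface is needed; anything
    # with a speak() method is acceptable.
    return thing.speak()

print(greet(Duck()))   # quack
print(greet(Robot()))  # beep

# Strong typing: a mismatched operation raises instead of
# being coerced behind your back.
try:
    '1' + 1
except TypeError:
    print('TypeError: str + int is not coerced')
```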
Re:Back to Basic (Score:2, Informative)
Re:Off the top of my head? (Score:3, Informative)
Re:Beards (Score:5, Informative)
Re:I don't really get the Java hate around here (Score:5, Informative)
Re:Off the top of my head? (Score:2, Informative)
Um. Great information. Python is like 40M+ installed. On Windows? Linux? Bloat compared to, say, Java? Do you include the quite extensive library under your 'bloat' characterization?
As far as quickies go, a change of email software on a client site required the old mailboxes to be converted into maildirs. While their admin searched the web for an appropriate program, I coded a script in 15 minutes (that's three versions of the script: one to convert to maildir, one to include saved mail folders from the user home directories, and one to make sure that already-read messages remained marked as read for the new POP/IMAP software).
It's obvious that everyone chooses their own tools. Python works for me on many levels. Sorry it doesn't for you.
Re:Off the top of my head? (Score:3, Informative)
I always thought Ruby was just popular because of Rails, but I've been starting to write more Ruby code lately, and I'm starting to see some benefits of the language itself. Namely:
Ruby is REALLY good for unit testing.
You may or may not be into unit testing, but if you are, Ruby really shines here. It shines because everything is mutable in Ruby. If a class has a method defined, you can change it on the fly. This makes many developers cringe because it seems so messy: "What do you mean the behavior of the method changed in the middle of execution?!!!"
And that's a well-founded fear. Changing object functionality on the fly when what you really need to do is design your objects better is BAD. VERY BAD. But there are a few places where it's just The Right Way To Write The Code. Like any language construct, *cough* GLOBALS *cough*, it can be used for good or evil.
Where Ruby's mutability can be used for Good is unit testing. Specifically, for creating mock objects. Say you have a method A which relies on method B that you're already testing elsewhere. You want your test to tell you whether A is broken, not whether (A OR B) is broken.
In most languages this is hard or impossible, because it means reaching inside the execution of A and temporarily changing the functionality of B. But in Ruby (with Rspec) it's easy:
it "A should work without testing B" do
SomeBigNastyObject.should_receive(:B).with('somedata').and_return('someresult')
MyObject.A.should == 'whatever_a_returns'
end
Note that my specification of SomeBigNastyObject just has the normal definition of B. The should_receive method is added by Rspec, and that method CHANGES the functionality of B so that it just returns the right value.
And if you write your test this way, the SomeBigNastyObject.B method is never called! It just works as if B returned 'someresult'. In addition, this test makes sure that that method WAS, in fact, called, and called only once.
This really is the way unit tests should be written, and Ruby does this really well. You can do it in Java with the Reflection API, but it's messier.
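For comparison, here is roughly the same stub-the-collaborator pattern sketched in Python with the standard unittest.mock module (the classes are hypothetical stand-ins mirroring the Ruby example above, not real APIs):

```python
from unittest import mock

class SomeBigNastyObject:
    @staticmethod
    def B(data):
        # The "big nasty" collaborator we don't want a unit
        # test of A to depend on.
        raise RuntimeError('expensive call')

def A():
    # A depends on B; we want to test A in isolation.
    return SomeBigNastyObject.B('somedata') + '!'

with mock.patch.object(SomeBigNastyObject, 'B',
                       return_value='someresult') as fake_b:
    assert A() == 'someresult!'          # B's real body never ran
    fake_b.assert_called_once_with('somedata')
```

As in the RSpec version, the real B is never invoked, and the mock also verifies that it was called exactly once with the expected argument.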
Erik
Re:readability (Score:4, Informative)
BASIC and Pascal are procedural [wikipedia.org].
SQL is declarative [wikipedia.org].
That's not why they're that way [wikipedia.org] at all.
Python's a great language, but that's no reason to get sloppy about the details.
Re:Aging Engineers (Score:3, Informative)
http://boost-sandbox.sourceforge.net/libs/proto/doc/html/boost_proto/users_guide/examples/lazy_vector.html [sourceforge.net]
Generic Programming in general is impossible in C# because of underpowered generics. Unless stuff like
template <typename Container, typename Accessor>
void random(Container &container, Accessor const &accessor)
{
    for (unsigned int i = 0; i < container.size(); ++i)
        accessor(i) = typename Accessor::value_type(std::rand());
}
is possible in C#, this argument holds. (Note that this is an artificial example I came up with; for a good example of real-world usage, check out Adobe GIL [adobe.com]).
Another example: http://www.gsse.at/multiprogramming/ [www.gsse.at]
C++'s true power lies in its templates. Templates are Turing complete, meaning that they form a meta language. Using this meta language, I can adapt code to a specific situation.

For example, I can have compile-time polymorphism, which is very useful when there is enough information while compiling to choose the actual type. I can have a list of factory class types, and do a call like create_image(img); which gets compiled to the actual creator function ONLY. No virtual functions, and the compiler can even inline without problems. Yes, a JIT could detect that function X has been used with the same parameters for 400 seconds, but this way I can eliminate unnecessary runtime overhead right from the start.

Yet another use was to generate code paths that only differed in pixel format type. I wrote a templated version and iterated over a list of enums at compile time. This helped a lot in being cache friendly while not requiring tons of code to be cloned. Using templates, one can write scientific computing code that rivals even Fortran in terms of performance. See: http://www.oonumerics.org/blitz/ [oonumerics.org]
I know C# 3.0 has a functional core, and this is wonderful: many problems can be solved much more easily and cleanly in a functional style. Generic programming and metaprogramming are the things I sorely miss. I would really like to have a language that has all the strengths of C++, minus its weaknesses (most notably the C legacy, hideous template syntax, and #include files). So far, D is the closest one, but it's not there yet. Also, C++ has ENORMOUS momentum...
Re:I don't really get the Java hate around here (Score:3, Informative)
Re:Opinions Are Like @ssholes (Score:3, Informative)
I got the same wiki about sex addiction. After looking at his crackpot site, I found out that this miracle called COSA is just a fancy name for graphical programming and data-driven languages (basically what LabVIEW does, or what a schematic diagram does).
The guy has clearly never used a schematic diagram, or he'd know how painful they are compared to doing the same thing in a normal language like VHDL. Need to add an extra input to one of your functions? UH OH!! Time to tear up all your connectors and sit for half an hour redrawing lines. The same thing would take 5 seconds (literally) in VHDL.
Hopefully, the hospital this guy escaped from finds him and returns him to his cell.
Re:Back to Basic (Score:3, Informative)
But yeah, the other poster got it right. It's really static vs dynamic typing rather than strong vs weak. Sorry for the confusion.
Re:Off the top of my head? (Score:4, Informative)
decent and it doesn't seem to be lacking anything necessary. I'm just
curious as to why someone would pick Ruby over some other language. I'm
not quite understanding what the "killer app" of Ruby is. I'm not sure
why this language had to be created.''
Ruby was inspired by a number of good and useful programming languages, such as Lisp, Smalltalk, Perl, and Python. It combines the good features of those into a single language. The result is a language that is very powerful, has clean syntax, is easy to get started with, and allows a little code to do a lot of work.
As for a killer app, I don't think there really is one. Ruby is and has always been a general-purpose language. It got its 15 minutes of fame with Rails, but that has since been cloned in other languages. Nowadays, Ruby is just good at many of the same things Perl is good at, and good at many of the same things Python is good at. I personally prefer Ruby, because it is a very well thought out language and, coming from Lisp, it feels natural to me. Still, I don't think Ruby is so much better that people should or will be switching in droves. It's an incremental improvement over already good and useful languages, not a revolution.
Now, for some specifics.
From Perl, Ruby takes first-class regular expressions. Many Ruby modules are also ports of Perl modules.
From Python, Ruby takes clean syntax. There's quite a lot of competition between Ruby and Python, so I imagine there is a lot of "me too" on both sides.
From Smalltalk, Ruby takes the object system (everything is an object, every object belongs to a class), and blocks (kind of like anonymous functions, but with special syntax). Blocks, especially, are very powerful, as you can use them to (almost, I have to say as a Lisp programmer) implement your own control structures; loops, iterators, etc.
From Lisp, or maybe from other languages that have these features, Ruby takes garbage collection, anonymous functions, dynamic typing, call/cc, symbols as first-class values, printing of objects in Ruby-readable form, a read-eval-print loop (enter code, have it evaluated and the results printed) and probably a bunch of other features I am too tired to think of right now (but not macros, alas).
Ruby also has exceptions, the printf family of functions, and a number of other features commonly found in modern programming languages.
The kicker is that it has all this in a single, coherent language, with a syntax that is easy to understand and learn, and few great pitfalls. Mostly, whether you already know a programming language or not, you can just start coding in Ruby, and it will be easy. There aren't lots of irritating silly parentheses in Ruby, nor is there a difference between scalar and list context. No buffer overflows, no integer overflows, no memory leaks. You don't need to change the way you think, and you don't need an IDE. It's easy to get started with, yet it doesn't suffer from the problems that languages that are easy to get started with usually have: bad design, limited expressive power, only really being suitable to one domain, etc.
Finally, some code, just for kicks:
Um, you're confused. (Score:2, Informative)
You don't seem to be very clear on the differences between strong vs. weak type systems, and static vs. dynamic types.
Python is strongly typed. In every place where a Python program can have a type error, the result is defined: some sort of exception is raised immediately, which can be handled regularly through the exception-handling system, or it propagates uncaught and kills your program immediately and in a clean fashion.
C is weakly typed. Many kinds of type errors can happen at runtime that will leave your program in an undefined state, in which it may continue running indefinitely. E.g., you can do pointer arithmetic and read data from random locations in memory, and interpret it as any type you want instead of what it's supposed to be.
Python is a dynamically typed language. A valid Python program may not contain enough information to infer, before runtime, the types of the values assigned to its variables. In other words, any variable may be assigned any value of any type, so values in Python must be tagged with their types at runtime, and the Python runtime system needs to check that the types of arguments are compatible with the types of the operations applied to them.
C is a statically typed language. The C compiler must be able to determine, at compilation time, the type of every variable, and every expression must type-check for a program to be valid. The way C ensures this is by requiring type declarations on every variable and function. However, that's not essential; languages like OCaml and Haskell are smart enough to infer the types of most of your variables and functions without declarations.
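A small sketch of the dynamic side: the same variable can be rebound to values of different types, and it is the runtime type tag, not any declared variable type, that gets checked when an operation is applied:

```python
x = 42           # x currently holds an int
print(type(x))   # <class 'int'>

x = 'forty-two'  # same variable, now rebound to a str
print(type(x))   # <class 'str'>

# The runtime tag is what the type check consults: the
# operation fails for the current value, not the variable.
try:
    x / 2
except TypeError as e:
    print('caught:', e)
```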
Now, having said all that, let's quote you:
The main problem with this statement here is that you're not specifying what you mean by "context" here. There are two kinds of context that could be relevant: (a) compilation context; (b) runtime context. As it turns out, Python (because of dynamic typing) is allowed to use (b) to figure out the type. However, a language like OCaml can also "figure out the type by context," in the very different sense that it can analyze the program well enough at compilation to figure out that the variables are bound to ints, without your telling it. (And to make it more complicated: an implementation of a dynamically typed language is not required to check types at runtime in every case. A technique used by optimizing compilers for dynamically typed languages is to perform incomplete type inference on programs, figure out the types of variables as best as possible, and use this information to remove runtime type checks where the compiler can prove that they will always succeed. See this blog post [lambda-the-ultimate.org] for examples of people who've tried to do this for Python in various ways.)
The phrase "multiply an Int by a String" turns out to be ambiguous. Most of the people who responded to you thought about it in terms of syntax: "foo" * 7. The thing is that Python overloads the * operator and dispatches it to different concrete operations depending on the types of the arguments. What Python does do is raise type errors when applying a concrete operation to incompatible types.
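Concretely, the dispatch looks like this: * means repetition for a str and an int, arithmetic for two ints, and a type error for two strs:

```python
print('foo' * 3)  # foofoofoo -- str * int dispatches to repetition
print(7 * 6)      # 42 -- int * int dispatches to arithmetic

# str * str matches no concrete operation, so the runtime
# raises a TypeError rather than guessing a coercion.
try:
    'foo' * 'bar'
except TypeError:
    print("TypeError: can't multiply str by str")
```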
But also, that is not what the term "duck typing" means. Read your own link. What you're describing is just plain old dynamic typing--raise a type error at runtime when the tagged type of the arguments does not match the type required by the operation.