
What Makes a Programming Language Successful? 1119

danielstoner writes "The article '13 reasons why Ruby, Python and the gang will push Java to die... of old age' makes an interesting analysis of the programming languages battling for a place in programmers' minds. What really makes a language popular? What really makes a language 'good'? What is success for a programming language? Can we say COBOL is a successful language? What about Ruby, Python, etc?"
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Thursday May 29, 2008 @12:40PM (#23588017)

    is (mostly) consistent with itself
    With itself. As in, the language is (mostly) consistent with itself. If you're coding with it, (mostly) everything Just Works(tm) the way you expect it to once you've learned how a few areas work. Want to declare an internal class? Works the same as a normal class. Want to reference it elsewhere? OuterClass.InnerClass (so long as it's public). Want to extend that class? class Bingo extends OuterClass.InnerClass. Want to output it as a string? toString() is part of the base Object class, and once you see that, you realize everything has it. (mostly) Consistent with itself. That's not even getting into event handling (which is more the standard library and not the language, but still the same concept).

    Yes, of course applications (which are not Java itself) may act strangely or crash entirely when jumping a major version of Java*, but what are you expecting? Win 3.1 apps can act a bit strangely under XP, XP apps may show some issues with Vista, gcc 3.2-made binaries can get all urpy over gcc 4.0-made libraries, and OS 9 apps need an entire emulation layer to run under OS X. Upgrades can break things. That is not news.

    *: Well, okay, that's a bit of confusion from Sun's end. To most people, "Java 1.5" to "Java 1.6" implies a minor upgrade, when in fact there was a considerable amount of change involved.
  • by hackstraw ( 262471 ) on Thursday May 29, 2008 @12:48PM (#23588113)
    I have some Java hate, but the Java of today is not the Java of 1997. Its core class libraries are complete and, I would assume, consistent.

    My first experiences with Java were the stuff that ran like crap as the so-called end-all-be-all write-once-run-anywhere GUI language. That is not true today. Java is now a middleware language. It's become glue, more behind the scenes than it was back in the day.

    So, what makes a programming language successful? Well, of course, its success!

    No, seriously: today, a programming language becomes and _stays_ successful if it meets these criteria. 1) Does it have a good user community, and is it still used for new projects and not just "legacy" ones? 2) Does it have extensibility and interoperability? That is a BIG one. CPAN, libraries, JARs, APIs: all of those additional features determine a successful programming language.

    Today, the most successful programming languages are FORTRAN (it's a science and engineering thing, and it's not going away tomorrow), C/C++, JavaScript, Java, Python, .NET, Perl (does SQL fit in here?), and I guess some Ruby. I have little exposure to Ruby, and it's the newest kid on the block I listed, so the jury is still out on that one.

    Programming languages come and go. The way I see it, the real question is how we are going to get any or all of the above languages to take advantage of the trend towards distributed and SMP systems.

    _NONE_ of the languages listed there does this particularly well, and there have been TONS of new languages meant to fix these problems, but to date we are left with threads, OpenMP, and MPI, plus some lesser-known languages like Erlang, Titanium, High Performance FORTRAN (or did they give up on that one?), and the like.

    I see programming going through a needed paradigm shift "Real Soon" (TM) to address these issues, along with the development tools. Computers are bigger and more complex than they were yesterday, and the languages have not yet caught up to this complexity.

    Maybe Ada will come back to life and fix all of this? I don't think so.
  • by Anonymous Coward on Thursday May 29, 2008 @12:51PM (#23588177)
    Good old Google. It has a cached copy here: http://tinyurl.com/6qo4nh

  • Re:Aging Engineers (Score:5, Informative)

    by ardor ( 673957 ) on Thursday May 29, 2008 @12:54PM (#23588243)
    I mean 1,000 lines of C++ and C# compared... I'd expect to see fewer errors in the C# and less severe errors when I find them.

    Depends on your skills. C# is a safer environment, but C++ has immensely more expressive power. With modern, well-written C++, those 1,000 lines may be worth 10,000-20,000 lines of C#/Java. Unfortunately, the ugly C++ syntax and its C cruft make unlocking the true advantages of C++ a black art.

    A trivial example is the STL. Java/C# containers don't even come close to the STL's power. Go further and look at Boost.MPL/Fusion/Proto, and you'll see stuff you simply cannot do with Java/C#.

    Well. If it were up to me, Lisp would be king. But it's not a perfect world :)
  • by Kentaree ( 1078787 ) on Thursday May 29, 2008 @12:57PM (#23588307) Homepage
    Java is designed to be backwards compatible (painfully so in fact). I can't say I've ever seen a problem with Java running an app compiled for an older version, and it's definitely not supposed to happen.
  • Oh No! (Score:2, Informative)

    by bledri ( 1283728 ) on Thursday May 29, 2008 @12:59PM (#23588347)

    Essentially this means that Python isn't really feasible to write any larger system and expect it to hold water over several years, or even decades.

    Quick, sell your Google stock!

    There is a lot of solid Python code out there. The best way to write Python is to also write unit tests. Which is a good practice in any language.

  • Web cache (Score:1, Informative)

    by Anonymous Coward on Thursday May 29, 2008 @12:59PM (#23588355)
    Cached web page [64.233.167.104]
  • by ardor ( 673957 ) on Thursday May 29, 2008 @01:00PM (#23588371)
    Repeat after me:

    C++ is NOT a "Microsoft language".
  • by peter303 ( 12292 ) on Thursday May 29, 2008 @01:01PM (#23588389)
    John McCarthy did. They were inventors of COBOL, FORTRAN and LISP - three languages still in use from the 1950s.
  • by klubar ( 591384 ) on Thursday May 29, 2008 @01:02PM (#23588413) Homepage
    Sorry, I should have said the .NET languages/environment, of which C++ is one.
  • The Graph (Score:3, Informative)

    by skeeto ( 1138903 ) on Thursday May 29, 2008 @01:03PM (#23588417)

    I found it interesting in this graph [littletutorials.com] in the article that, beginning around 2004, the popularity of Python closely follows the popularity of Delphi. Is this some kind of flaw in how the data was collected, such as too small a sample size (Delphi has nothing to do with Python, as far as I know)? Or is there some overall pattern causing this? A few others are pretty close in some spots, too, in terms of first derivatives.

    Note that the server isn't really /.ed at the time I am writing this, but is throttling itself based on limitations of the hosting account (it says only 10 concurrent servings). If you keep trying (i.e., refreshing in the browser) you will get the image I linked to above.

  • by Em Ellel ( 523581 ) on Thursday May 29, 2008 @01:03PM (#23588431)

    > is (mostly) consistent with itself

    Are you serious? Where I work, we regularly use at least three Java applications, and each one requires a particular version of Java, none of which are the same. One of them requires Java 1.5, while another one will break completely if Java 1.5 is installed. It's a nightmare! And while yes, the version requirements may be the fault of the developers, the fact that it can happen at all is unacceptable.
    Erm, you must be running Windows and not have a sysadmin with a clue. You can run as many versions of the JDK in parallel as you want and they will not interfere with each other. It's not rocket science, just set a few env variables. That's actually one of my favorite things about Java: you are never tied to a "system" install of the JVM; in fact, you don't even need root privileges to install a JVM.

    -Em
  • by Schadrach ( 1042952 ) on Thursday May 29, 2008 @01:09PM (#23588541)
    Need code that you KNOW has no errors aside from logic errors on the part of the programmer? Use Ada. That's really where Ada fits. You can do very little wrong without the compiler screaming at you and then failing to compile; things that merely cause "warnings" in C cause Ada to fail compilation until you fix them.

    Need to rapidly produce major components, and easily wrap C for the lower-level, more performance-intensive stuff? Use Python. No, really. That seems to be Python's main niche: it's a great language for writing large blocks of program logic very quickly, is poor at low-level stuff, but can trivially wrap C to do the things it is poor at.
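
    As a minimal sketch of that "trivially wrap C" point (this uses only the standard ctypes module; the library lookup is platform-dependent, and the example is mine, not from TFA):

        import ctypes
        import ctypes.util

        # Load the C standard library (name resolution varies by platform)
        libc = ctypes.CDLL(ctypes.util.find_library("c"))

        # Call a plain C function directly from Python
        print(libc.strlen(b"hello"))  # prints 5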
  • by p3d0 ( 42270 ) on Thursday May 29, 2008 @01:21PM (#23588715)

    I'm mainly a C hacker, but I don't get why people would prefer Python over Java.
    Assuming that's an honest question...

    I wouldn't write a large system in Python, but I use it a lot for quick (say, up to 1 kLOC) programs because it has a clean, pseudocode-like syntax for expressing algorithms that is terse without being cryptic. It has enough error checking to help me avoid a lot of mistakes, but not so much that it slows down my hacking. It doesn't necessarily have the best regular expression syntax or the best performance, etc., but it suits me for the scripts I write.
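
    For instance, a typical quick script in that pseudocode-like style might look like this (a made-up example, not from the parent post):

        # Count word frequencies in a file and print the ten most common
        counts = {}
        with open("input.txt") as f:
            for line in f:
                for word in line.split():
                    counts[word] = counts.get(word, 0) + 1

        for word in sorted(counts, key=counts.get, reverse=True)[:10]:
            print(word, counts[word])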
  • by thepacketmaster ( 574632 ) on Thursday May 29, 2008 @01:32PM (#23588915) Homepage Journal
    If anyone thinks COBOL is a dead language simply because the majority of current programmers don't know it, think again. COBOL is alive and well, with new programs being written in it every day. Mainstream programs running on Wintel/Unix systems don't use it, but it is the choice for governments, banks, and other large organizations. Why? Because it has a history, it is stable and reliable, and it is simpler for someone unfamiliar with a program to learn how it works. Every cheque processed today still goes through a sorter powered by a COBOL system.

    As the author mentions, a new programming language is only useful if it addresses a real gap in other languages. Java addresses the gaps of memory management, complexity, and cross-platform support. Java now has a huge install base and serves as a relatively stable language that can do whatever the imagination desires. It matures with every version, gaining more credibility. That's pretty tough to compete with.

    Learning a new language simply to follow a trend is a sign of inexperience, not of youth; old or young doesn't play into it (although you may feel it does). If a new programming language does exactly the same thing as another, there is no point migrating to it. Time and costs simply don't warrant it.

    I can agree that learning several languages is an advantage. It helps focus your understanding of programming concepts by having to deal with them in different ways. But if you look at people who know multiple languages, the languages are probably going to be from different categories: a low-level language, a high-level language, a scripting language, a compiled language, a web-friendly language, etc.

    Btw, the languages I know (and the order I learned them in) are: BASIC, Assembler, C, Unix Shell scripting, C++, Perl, PHP, Java, COBOL. (Yes, I just recently learned COBOL for a new job)

  • Re:Aging Engineers (Score:3, Informative)

    by burris ( 122191 ) on Thursday May 29, 2008 @01:40PM (#23589051)

    The reality was, that the kids just wanted to pretend they were doing OOP. They still used straight C, they just created structs and organized functions in files as if they were classes. It was actually rather clever and made it easier to maintain.

    I was under the impression that this is how all quality contemporary software development in the C language was conducted. Maybe the kids weren't as clever as they were up to date.

  • by maxume ( 22995 ) on Thursday May 29, 2008 @01:43PM (#23589115)
    Are you talking about section 10, labeled "Python Pre-2.1 did not have lexical scopes"?

    If so, your criticism is bizarre: the example is written to illustrate that "Python Pre-2.1 did not have lexical scopes", not to illustrate the shortest way to rewrite the built-in sum function (you do realize that the idiomatic implementation of sum in Python is the built-in function, right?).

    The reason map and reduce aren't cared about is that most people have an easier time with list comprehensions. Your code:

            l = l.strip().split(',')
            l = map(lambda x: int(x.strip()), l)

    can be written as:

            l = [int(x.strip()) for x in l.strip().split(',')]

    in Python 2.4 onwards. Obviously, you could put that on as many lines as you wanted. If you are worried about performance, generator expressions are very similar to list comprehensions but lazily evaluated:

            g = (int(x.strip()) for x in l)

    g would then create items as they are called for by some consumer (for instance, a for loop or a container constructor).
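
    For example (the input string here is invented), the generator can be fed straight to a consumer such as the built-in sum mentioned above:

        line = " 1, 2, 3, 4 "
        g = (int(x.strip()) for x in line.split(','))
        total = sum(g)  # the int() calls happen lazily, one item at a time
        print(total)    # 10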
  • by Otter ( 3800 ) on Thursday May 29, 2008 @01:45PM (#23589151) Journal
    it is not possible to count python 3000 in amongst those languages with extraordinary power, because the developers - primarily guido - believe that the functional-language-based primitives (map, lambda, reduce, filter) are "unnecessary".

    1) Those capabilities will still exist in the base language, just with different syntax.

    2) If you want the old syntax, it will be available in a standard library.

    Save your hysteria for something genuinely catastrophic, like the loss of the print statement.

  • by Kupek ( 75469 ) on Thursday May 29, 2008 @01:46PM (#23589169)
    You've probably never done any coding in Python. Check out the book Dive Into Python (it's free and online): http://www.diveintopython.org/ [diveintopython.org] Even browsing through it will give you a better idea of what Python is good for.

    Personally, I think of Python as a prettier and more coherent version of Perl.
  • Re:Back to Basic (Score:2, Informative)

    by Anonymous Coward on Thursday May 29, 2008 @01:46PM (#23589171)
    No, he described strong typing. There is no implicit casting from one type to another going on in Python. Weak typing involves implicit casting of types under certain operators.

    Static and dynamic typing are orthogonal to strong and weak typing. Statically typed languages have their type integrity constraints checked at compile-time. Dynamically typed languages do at least some type checking at run-time, usually because resolving a given object's type is undecidable (until it is decided by actually running the code).

    Python has a strong, dynamic typing system. So does Ruby. But this is mostly an implementation detail, since these languages support the actor model of object orientation, which means that actors/duck typing is the appropriate model for the type system.
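
    To make the "no implicit casting" point concrete, here is a short interactive sketch (standard Python, any recent version):

        >>> 1 + "2"         # no implicit coercion: strong typing
        Traceback (most recent call last):
          File "<stdin>", line 1, in <module>
        TypeError: unsupported operand type(s) for +: 'int' and 'str'
        >>> 1 + int("2")    # explicit conversion is fine
        3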
  • Re:Back to Basic (Score:2, Informative)

    by morgan_greywolf ( 835522 ) * on Thursday May 29, 2008 @02:00PM (#23589381) Homepage Journal
    "Strong typing" and "weak typing" mean a lot of different things to a lot of different people. Those who feel Python programs are difficult to maintain because they are not statically typed like Java or C don't get Python. Consider:

    >>> def func(a,b,c):
    ...     return (a+b)*c
    ...
    >>> func(1,2,3)
    9

    versus:

    >>> func("apples"," and oranges ", 3)
    'apples and oranges apples and oranges apples and oranges '

    But, also this:

    >>> func("apples","oranges","grapes")
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "<stdin>", line 2, in func
    TypeError: can't multiply sequence by non-int of type 'str'

    What's important to understand is that there IS type checking; ad-hoc polymorphism simply creates power. With power may come confusion, but that is the fault of the programmer, not the language.
  • by CarpetShark ( 865376 ) on Thursday May 29, 2008 @02:16PM (#23589645)
    Yes. For a mixture, D seems like the ideal option, but it's so lacking in libraries that it's next to useless for the moment.
  • Re:Beards (Score:5, Informative)

    by VGPowerlord ( 621254 ) on Thursday May 29, 2008 @02:27PM (#23589785)
    That article is basically a rip of this one [microsoft.co.il] by Tamir Khason. Heck, it's essentially a blatant copy of the 2004 version [alenz.org] of Tamir's article with some of the 2008 pictures thrown in!
  • by dk.r*nger ( 460754 ) on Thursday May 29, 2008 @02:28PM (#23589799)
    Set the JAVA_HOME environment variable to the path of the JRE you want to use. It's commonly done in a .bat file launching the application.
  • by tzot ( 834456 ) <antislsh@medbar.gr> on Thursday May 29, 2008 @02:28PM (#23589807) Homepage

    People are quite capable doing quick things in Java without pulling in giant bloaty enterprise frameworks. Plus Python is bloat, I think it's like 40M+ installed.

    As for banging out quick projects, I tend to do them in C or shell scripts because I know they will either become real projects or they need to be understood by all.

    Um. Great information. Python is like 40M+ installed. On Windows? Linux? Bloat compared to, say, Java? Do you include the quite extensive library under your 'bloat' characterization?

    As far as quickies go: a change of email software on a client site required the old mailboxes to be converted into maildirs. While their admin searched the web for an appropriate program, I coded a script in 15 minutes (that's three versions of the script: one to convert to maildir, one to include saved mail folders from the user home directories, and one to make sure that already-read messages remained marked as read for the new POP/IMAP software).

    It's obvious that everyone chooses their own tools. Python works for me on many levels. Sorry it doesn't for you.

  • by epukinsk ( 120536 ) on Thursday May 29, 2008 @03:07PM (#23590417) Homepage Journal
    I'm having similar questions, only wondering why people would prefer Ruby over Java... My understanding is that the main reason for choosing Ruby is to use it with Rails...

    I always thought Ruby was just popular because of Rails, but I've been starting to write more Ruby code lately, and I'm starting to see some benefits of the language itself. Namely:

    Ruby is REALLY good for unit testing.

    You may or may not be into unit testing, but if you are, Ruby really shines here. It shines because everything is mutable in Ruby. If a class has a method defined, you can change it on the fly. This makes many developers cringe because it seems so messy: "What do you mean the behavior of the method changed in the middle of execution?!!!"

    And that's a well-founded fear. Changing object functionality on the fly when what you really need to do is design your objects better is BAD. VERY BAD. But there are a few places where it's just The Right Way To Write The Code. Like any language construct, *cough* GLOBALS *cough*, it can be used for good or evil.

    Where Ruby's mutability can be used for Good is unit testing. Specifically, for creating mock objects. Say you have a method A which relies on method B that you're already testing elsewhere. You want your test to tell you whether A is broken, not whether (A OR B) is broken.

    In most languages this is hard or impossible, because it means reaching inside the execution of A and temporarily changing the functionality of B. But in Ruby (with RSpec) it's easy:

    it "A should work without testing B" do
        SomeBigNastyObject.should_receive(:B).with('somedata').and_return('someresult')
        MyObject.A.should == 'whatever_a_returns'
    end

    Note that SomeBigNastyObject itself still has the normal definition of B. The should_receive method is added by RSpec, and that method CHANGES the functionality of B so that it just returns the right value.

    And if you write your test this way, the SomeBigNastyObject.B method is never actually called! The test behaves as if B returned 'someresult'. In addition, it makes sure that B WAS, in fact, called, and called exactly once.

    This really is the way unit tests should be written, and Ruby does this really well. You can do it in Java with the Reflection API, but it's messier.

    Erik
  • Re:readability (Score:4, Informative)

    by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Thursday May 29, 2008 @03:23PM (#23590613) Homepage Journal

    BASIC and PASCAL etc. are of the "functional programming" ilke

    BASIC and Pascal are procedural [wikipedia.org].

    as is SQL

    SQL is declarative [wikipedia.org].

    LISP, SCHEME and SMALLTALK were all developed when space was at a premium. so, you kept the source file small by using as obtuse-as-possible a syntax.

    That's not why they're that way [wikipedia.org] at all.

    Python's a great language, but that's no reason to get sloppy about the details.

  • Re:Aging Engineers (Score:3, Informative)

    by ardor ( 673957 ) on Thursday May 29, 2008 @03:35PM (#23590749)
    Stuff like this:

    http://boost-sandbox.sourceforge.net/libs/proto/doc/html/boost_proto/users_guide/examples/lazy_vector.html [sourceforge.net]

    Generic programming in general is impossible in C# because of its underpowered generics. Unless stuff like

    template <typename Container, typename Accessor>
    void random(Container &container, Accessor const &accessor)
    {
        for (unsigned int i = 0; i < container.size(); ++i)
            accessor(i) = typename Accessor::value_type(std::rand());
    }

    is possible in C#, this argument holds. (Note that this is an artificial example I came up with; for a good example of real-world usage, check out Adobe GIL [adobe.com]).

    Another example: http://www.gsse.at/multiprogramming/ [www.gsse.at]

    C++'s true power lies in its templates. Templates are Turing complete, meaning that they form a meta-language. Using this meta-language, I can adapt code to a specific situation. For example, I can have compile-time polymorphism, which is very useful when there is enough information at compile time to choose the actual type. I can have a list of factory class types and make a call like create_image(img) that gets compiled down to the actual creator function ONLY. No virtual functions, and the compiler can even inline without problems. Yes, a JIT could detect that function X has been used with the same parameters for 400 seconds, but this way I can eliminate unnecessary runtime overhead right from the start.

    Yet another use was to generate code paths that differed only in pixel format type. I wrote a templated version and iterated over a list of enums at compile time. This helped a lot in being cache friendly while not requiring tons of cloned code. Using templates, one can write scientific computing code that rivals even Fortran in terms of performance. See: http://www.oonumerics.org/blitz/ [oonumerics.org]

    I know C# 3.0 has a functional core, and this is wonderful: many problems can be solved much more easily and cleanly in a functional style. Generic programming and metaprogramming are the things I sorely miss. I would really like a language that has all the strengths of C++ minus its weaknesses (most notably the C legacy, the hideous template syntax, and #include files). So far, D is the closest one, but it's not there yet. Also, C++ has ENORMOUS momentum...
  • by leomekenkamp ( 566309 ) on Thursday May 29, 2008 @03:45PM (#23590923)

    Today, the most successful programming languages are (...), .NET, (...) (does SQL fit in here?) (...)
    .NET is not a programming language but a platform, and SQL is a query language, not a programming language (Turing machine stuff, etc.). Java does thread handling quite decently; it is just too difficult to grasp for most programmers.
  • by clampolo ( 1159617 ) on Thursday May 29, 2008 @04:07PM (#23591245)

    Not knowing what exactly "COSA" was, I did the sensible thing and wiki'd it.

    I got the same wiki about sex addiction. After looking at his crackpot site, I found out that this miracle called COSA is just a fancy name for graphical programming and data-driven languages (basically what LabVIEW does), or what a schematic diagram does.

    The guy has clearly never used a schematic diagram, or he'd know how painful they are compared to doing the same thing in a normal language like VHDL. Need to add an extra input to one of your functions? UH OH!! Time to tear up all your connectors and sit for half an hour redrawing lines. The same thing would take 5 seconds (literally) in VHDL.

    Hopefully, the hospital this guy escaped from finds him and returns him to his cell.

  • Re:Back to Basic (Score:3, Informative)

    by shutdown -p now ( 807394 ) on Thursday May 29, 2008 @04:17PM (#23591401) Journal

    Please note that at all points in time the interpreter knows the type of variable a.
    No, it knows the type of value that 'a' has; but that can change at run-time. Variables themselves do not have types in Python.

    But yeah, the other poster got it right. It's really static vs dynamic typing rather than strong vs weak. Sorry for the confusion.
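
    To illustrate (a two-line sketch in a Python 2 session; type() is the standard built-in): the name gets rebound, and only the value carries a type.

        >>> a = 1
        >>> type(a)
        <type 'int'>
        >>> a = "hello"
        >>> type(a)
        <type 'str'>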

  • by RAMMS+EIN ( 578166 ) on Thursday May 29, 2008 @06:44PM (#23593413) Homepage Journal
    ``I don't think that Ruby is bad, not by a long shot. It's seems fairly
    decent and it doesn't seem to be lacking anything necessary. I'm just
    curious as to why someone would pick Ruby over some other language. I'm
    not quite understanding what the "killer app" of Ruby is. I'm not sure
    why this language had to be created.''

    Ruby was inspired by a number of good and useful programming languages, such as Lisp, Smalltalk, Perl, and Python. It combines the good features of those into a single language. The result is a language that is very powerful, has clean syntax, is easy to get started with, and allows a little code to do a lot of work.

    As for a killer app, I don't think there really is one. Ruby is and has always been a general-purpose language. It got its 15 minutes of fame with Rails, but that has since been cloned in other languages. Nowadays, Ruby is just good at many of the same things Perl is good at, and good at many of the same things Python is good at. I personally prefer Ruby, because it is a very well thought out language and, coming from Lisp, it feels natural to me. Still, I don't think Ruby is that much better that people should or will be switching in droves. It's an incremental improvement over already good and useful languages, not a revolution.

    Now, for some specifics.

    From Perl, Ruby takes first-class regular expressions. Many Ruby modules are also ports of Perl modules.

    From Python, Ruby takes clean syntax. There's quite a lot of competition between Ruby and Python, so I imagine there is a lot of "me too" on both sides.

    From Smalltalk, Ruby takes the object system (everything is an object, every object belongs to a class) and blocks (kind of like anonymous functions, but with special syntax). Blocks, especially, are very powerful, as you can use them to (almost, I have to say as a Lisp programmer) implement your own control structures: loops, iterators, etc.

    From Lisp, or maybe from other languages that have these features, Ruby takes garbage collection, anonymous functions, dynamic typing, call/cc, symbols as first-class values, printing of objects in Ruby-readable form, a read-eval-print loop (enter code, have it evaluated and the results printed) and probably a bunch of other features I am too tired to think of right now (but not macros, alas).

    Ruby also has exceptions, the printf family of functions, and a number of other features commonly found in modern programming languages.

    The kicker is that it has all this in a single, coherent language, with a syntax that is easy to understand and learn, and few great pitfalls. Mostly, whether you already know a programming language or not, you can just start coding in Ruby, and it will be easy. There aren't lots of irritating silly parentheses in Ruby, neither is there a difference between scalar and list context. No buffer overflows, no integer overflows, no memory leaks. You don't need to change the way you think, you don't need an IDE. It's easy to get started with, and yet doesn't suffer from problems that languages that are easy to get started with usually have: bad design, limited expressive power, only really being suitable to one domain, etc.

    Finally, some code, just for kicks:

    # Define an array with first names and one with last names
    first_names = [ 'Alice', 'Bob', 'Charlie', 'Deborah', 'Eve' ]
    last_names = [ 'Cooper', 'Jones', 'Smith' ]

    # Define a class Person
    # Each person has a first name and a last name
    # which have to be passed to the constructor
    # and can be accessed using an accessor
    class Person
      def initialize first_name, last_name
        @first_name = first_name
        @last_name = last_name
      end

      attr_accessor :first_name, :last_name
    end

    # Add a pick method to the class Array
    # The pick method returns an element from the array at random
    class Array
      def pick
        self[rand(length)]
      end
    end
  • Um, you're confused. (Score:2, Informative)

    by Estanislao Martínez ( 203477 ) on Thursday May 29, 2008 @10:00PM (#23595187) Homepage

    You don't seem to be very clear on the differences between strong vs. weak type systems, and static vs. dynamic types.

    Python is strongly typed. In every place where a Python program can have a type error, the result is defined: some sort of exception is raised immediately, and it can either be handled normally through the exception handling system, or propagate uncaught and kill your program immediately and in a clean fashion.
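
    To see both halves of that in one snippet (a minimal sketch, standard Python):

        try:
            1 + "2"                  # a type error, caught at the operation
        except TypeError as e:
            print("caught:", e)
        # execution continues here, in a perfectly defined state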

    C is weakly typed. Many kinds of type errors can happen at runtime that will leave your program in an undefined state in which it may continue running indefinitely. E.g., you can do pointer arithmetic, read data from random locations in memory, and interpret it as any type you want instead of what it's supposed to be.

    Python is a dynamically typed language. A valid Python program may not contain enough information to infer, before runtime, the types of the values that will be assigned to its variables. In other words: any variable may be assigned any value of any type, so values in Python must be tagged with their types at runtime, and the Python runtime system needs to check at runtime that the types of arguments are compatible with the operations applied to them.

    C is a statically typed language. The C compiler must be able to determine, at compilation time, the type of every variable, and every expression must type-check for a program to be valid. The way C ensures this is by requiring type declarations on every variable and function. That's not essential, however; compilers for languages like OCaml and Haskell are smart enough to infer the types of most of your variables and functions without declarations.

    Now, having said all that, let's quote you:

    Python assumes you know what the hell you're doing, so it won't throw errors if you create two variables, put an Int in each one, and do an Int operation on them, all without declaring a type...It'll figure out the type by context.

    The main problem with this statement is that you're not specifying what you mean by "context." There are two kinds of context that could be relevant: (a) compilation context; (b) runtime context. As it turns out, Python (because of dynamic typing) is allowed to use (b) to figure out the type. However, a language like OCaml can also "figure out the type by context," in the very different sense that it can analyze the program well enough at compilation time to figure out that the variables are bound to ints, without your telling it. (And to make it more complicated: an implementation of a dynamically typed language is not required to check types at runtime in every case. A technique used by optimizing compilers for dynamically typed languages is to perform incomplete type inference on programs, figure out the types of variables as best as possible, and use this information to remove runtime type checks where the compiler can prove they will always succeed. See this blog post [lambda-the-ultimate.org] for examples of people who've tried to do this for Python in various ways.)

    However, if you try to multiply an Int by a String, it'll throw the same type errors any other strongly type language will. They call it "duck typing."

    The phrase "multiply an Int by a String" turns out to be ambiguous. Most of the people who responded to you thought about it in terms of syntax: "foo" * 7. The thing is that Python overloads the * operator and dispatches it to different concrete operations depending on the types of the arguments. What Python does do is throw the type errors when it applies a concrete operation.
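
    A quick interactive sketch of that dispatch (standard Python; the error text is CPython's): the same * syntax resolves to repetition, multiplication, or a type error depending on the operand types.

        >>> "foo" * 3                # sequence repetition
        'foofoofoo'
        >>> 7 * 6                    # integer multiplication
        42
        >>> "foo" * "bar"            # no concrete operation matches
        Traceback (most recent call last):
          File "<stdin>", line 1, in <module>
        TypeError: can't multiply sequence by non-int of type 'str'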

    But also, that is not what the term "duck typing" means. Read your own link. What you're describing is just plain old dynamic typing: raise a type error at runtime when the tagged type of the arguments does not match the type required by the operation.
