Programming IT Technology

What are the Next Programming Models? 540

jg21 writes "In this opinion piece, Simeon Simeonov contemplates what truly new programming models have emerged recently, and nominates two: RIAs and what he calls 'composite applications' (i.e. using Java, .NET or any other programming language). He notes that Microsoft will be trying to achieve RIAs in Avalon, but that it's late out of the gate. He also cites David Heinemeier Hansson's Ruby on Rails project as showing great promise. 'As both a technologist and an investor I'm excited about the future,' Simeonov concludes. It's a thoughtful piece, infectious in its quiet enthusiasm. But what new models are missing from his essay?"
This discussion has been archived. No new comments can be posted.

  • FP (Score:5, Funny)

    by tigersha ( 151319 ) on Wednesday August 10, 2005 @02:54PM (#13288111) Homepage
    Functional Programming, not First Post!
    • Mod parent up. (Score:2, Insightful)

      by moultano ( 714440 )
      Functional programming is awesome, and I'm thoroughly convinced that it will take over just about everything it's feasible for it to take over. There is nothing like the feeling of writing a program, having it type check, and not having to test it because you can look at the code and tell that it proves its own correctness.
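
      To make that concrete, here's a minimal Haskell sketch (a toy example of mine, nothing from TFA) of what "the type carries the proof" feels like: the signature promises a sensible result for every input, including the empty list, and the compiler holds every caller to it.

        -- the type admits no crash on []: callers must handle Nothing
        safeHead :: [a] -> Maybe a
        safeHead []      = Nothing
        safeHead (x : _) = Just x

        main :: IO ()
        main = do
          print (safeHead ([] :: [Int]))  -- Nothing
          print (safeHead [1, 2, 3])      -- Just 1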
    • by MarkEst1973 ( 769601 ) on Wednesday August 10, 2005 @03:07PM (#13288239)
      Paul Graham has written extensively on how languages are becoming more and more like one from yesteryear: LISP.

      See Beating the averages [paulgraham.com] for a well-written and thoughtful essay.

      In a nutshell, languages themselves vary in power. No one disputes that. All things being equal, you should generally choose the most powerful language you can all the time. As we move more and more to server-hosted software, your choice of language is incredibly important because a) it's your choice, not forced on you by being the language of the OS and b) it can be a huge competitive advantage.

      Matz (Ruby's creator) acknowledges ripping off ideas from Lisp (but putting a friendlier face to it). Python is Lispy. Javascript has been called Lisp in C's clothing. These are all functional languages, or can be used functionally.

      Graham noted how all languages are trending more towards Lisp in terms of features (see the essay linked above). Want further proof? C# 2.0 is getting lexical closures. Innovation from Microsoft! These were available in Lisp for 30 years, javascript for 10 (since it was created), they're in Perl 5, Ruby, I can go on...
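
      For anyone who hasn't met them: a lexical closure is just a function that captures variables from its defining scope. A tiny Haskell illustration (my own toy example):

        -- mkAdder returns a function that closes over n
        mkAdder :: Int -> (Int -> Int)
        mkAdder n = \x -> x + n

        main :: IO ()
        main = do
          let add5 = mkAdder 5
          print (add5 10)                    -- 15
          print (map (mkAdder 1) [1, 2, 3])  -- [2,3,4]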

      If languages continue to become higher and higher level, wouldn't we need to investigate this weird AI language from 1958 and see what features it doesn't have in order to do more meaningful research? 'cause these days, all the "new" features of today's languages are decades old...

      • by Atzanteol ( 99067 ) on Wednesday August 10, 2005 @03:21PM (#13288335) Homepage
        LISP proved one thing: it doesn't matter what features your language has; if it has a crappy syntax, nobody will ever use it.

        (I'd (stab (my (eyeballs out)) (if I needed)) (to look)(at LISP) all ) day)
        ))))))))))))))))))))

        Obligatory 20 closing parens that inevitably appear...
        • by llamaguy ( 773335 ) on Wednesday August 10, 2005 @03:25PM (#13288366)
          Q: How do you know when you've achieved Lisp Enlightenment? A: When the parenthesis vanish.
          • by GCP ( 122438 ) on Thursday August 11, 2005 @02:37AM (#13292263)
            For newbies to Lisp, the parentheses don't vanish because of some mystical enlightenment. They vanish primarily because you've written your code according to a standard that specifies how it is to be indented. You parenthesize correctly when writing, then simply ignore the parens when reading and look at the indentation level.

            Of course I'm doing some handwaving here about the writing-it-correctly part. Until you memorize the major idioms, you'll often start something with a single paren when it really needs to start with two, for example, and you'll get weird behavior that ends up driving you to randomly adding and removing parens until it seems to work. Admittedly that's a bit of a hurdle at first, but after some experience, that part gets easy. (Like glancing at for (int x=0; x<10; x++) and reading "do it ten times" without having to think about it. A lot of people forget how much thinking a newbie has to do to parse such an expression the first few times.)

            The real problem with Lisp isn't the parens. Once you get over the initial hurdle, you just look at the indentation. The problem is that dev platforms these days are so much more than just a language. The basic concepts underlying a Lispy language are almost timeless. The whole rest of the dev system, though, has a shelf life of about a decade or less, after which time the way it is made available, the libraries, the editors you have to use, the string model, the constraints it's optimized for, the compromises it has made, its interaction with other technologies, etc., are all out of touch with current realities. Such is Lisp today.

            (Paul Graham once seemed like the guy who could rejuvenate Lisp, but each year that passes makes that less likely. Speaking of out of touch with current realities.... Even Microsoft's secret projects are more open.)

          • Reminds me of a post I saw on Usenet (r.h.f.?) many years ago; I can't find a reference to it any more, but it was someone claiming to have a copy of the LISP source code to SDI (Reagan's Star Wars project, from way back). They proved it by showing the last page of the code:

            Slashdot won't let me post it, but it was a solid page of ))))'s.
        • by SimHacker ( 180785 ) * on Wednesday August 10, 2005 @03:32PM (#13288412) Homepage Journal
          If you're afraid of parentheses, then you'd better not use XML! It has TWICE as many parens as Lisp. You should get a job flipping burgers or something, instead.

          -Don

        • The best way of debugging lisp is to keep adding closing parentheses until the interpreter stops giving you errors...
        • by Coryoth ( 254751 ) on Wednesday August 10, 2005 @04:06PM (#13288659) Homepage Journal
          LISP: Lots of Irritating Silly Parentheses.

          In practice it has a very clean and elegant syntax though. If your editor doesn't do bracket matching you might have a few issues, but then what sort of half assed editor are you using?

          Besides you can always try ML or Haskell which are much more pure functional than LISP and have hardly any parentheses (which I actually find occasionally irritating).

          Jedidiah.
        • ... and the irony is that the parentheses aren't necessary. camlp4 [jambon.free.fr] is a macro language for extending ocaml [ocaml-tutorial.org] and it shows that you don't need to express the language unnaturally just to allow macros.

          Rich.

      • by Tumbleweed ( 3706 ) * on Wednesday August 10, 2005 @03:25PM (#13288361)
        If languages continue to become higher and higher level, wouldn't we need to investigate this weird AI language from 1958

        "Doh! Why do we need all these _new_ languages? Everyone knows programming languages were perfected in 1958. It's a scientific fact!" :)
      • Matz (Ruby's creator) acknowledges ripping off ideas from Lisp (but putting a friendlier face to it). Python is Lispy. Javascript has been called Lisp in C's clothing. These are all functional languages, or can be used functionally.

        How do you define "functional language"? The key features of functional languages are that they (a) they reduce, or entirely eliminate, side effects, (b) have functions as first-class objects, (c) provide support for function currying, and (d) provide lambda expressions. No

      • "Javascript has been called Lisp in C's clothing."

        So the ways Javascript differs from C are due to being LISP like? I find it hard to imagine a more damning indictment of a language.

        "All things being equal, you should generally choose the most powerful language you can all the time"

        But all things are not equal. Being able to use that power is important too. Sure, LISP is powerful, in theory. But writing it makes me yearn for the straightforward simplicity of C++!

        LISP fanatics bug me. If everyone just a
    • DP (Score:3, Insightful)

      by SimHacker ( 180785 ) *
      Declarative Programming, not Data Processing!

      -Don

    • Erlang (Score:3, Interesting)

      I used Erlang professionally for a while, and liked it, but I have some doubts as well. It did not make me 10 times more productive, nor my code error-free. It was not quite as good as a "scripting language" in terms of productivity, I felt, although it runs quite quickly, and I like the concurrency model a lot.

      In any case, I was left with a feeling of "yeah, I like this and would use it again, but it's not something that is going to wipe the floor with older models".

      Also, I have some doubts as to how much FP "Sca [dedasys.com]
  • by bahwi ( 43111 ) on Wednesday August 10, 2005 @02:55PM (#13288114)
    Nothing is permanent. However, after so long you're gonna start getting rehashed methods. It's like a big circle everyone is running around in, looking for the absolute best. Yes, some are better than others, but there is no perfect one. Need OO for a simple 10-line PHP script? Hell no, unless you're relying on a lot of 3rd-party libraries. Need Ruby on Rails for a statistics generator with no front end whatsoever? Nope. It all changes, but there is some good stuff we pick up along the way. But I don't think we'll ever find something that is just "perfect"; it's more of a never-ending quest to find the better one, and to stay on top of all the ones from the past.
    • Well, for web development (God, do I now have to call this "RIA development"?) I found a diamond in the rough.

      It turns out there's this Python-based application server/templating language called SkunkWeb (http://www.skunkweb.org/ [skunkweb.org]) which seems to be the Holy Grail for me of, well, a Python-based web framework that doesn't [webwareforpython.org] completely [zope.org] suck [cherrypy.org] (Okay, I know 1995 and CGI was awesome and everything, but no one should be writing "print '<html><head>'..." statements within Python code to make web pages, an
      • Wow. Calling out to components and embedding tags with code. You're right, the win is that this way you can separate logic from presentation in a much cleaner manner than other web development frameworks.

        Well, other than some, that is...

        <cfinvoke component="foo.Users" method="getSome" returnvariable="q" />
        <table>
        <cfoutput query="q">
        <tr><td>#username#</td></tr>
        </cfoutput>
        </table>

        ColdFusion's only been doing this sort of thing for ye

      • by sammy baby ( 14909 ) on Wednesday August 10, 2005 @05:01PM (#13289153) Journal
        So, in that model, you have an abstracted version of your database logic mixed in with the presentation stuff. Meh, pass. RoR defers the "which database objects will I need on this page?" question to the controller. This allows you to put only the barest minimum of stuff in the actual template for the web page: just exactly enough to get the presentation right:
        <table>
        <%= render_collection_of_partials "user_list", @users %>
        </table>
        In your users_controller.rb file:
        def list
        @users = Users.find_all
        end
        And in a separate file called _user_list.rhtml:
        <tr><td><%=h user_list["username"] %></td></tr>
        Of course, if you prefer, you can iterate over the list right in the page, but if you're doing more than a single line or have some hairy presentation html in there, you have the option of just dumping it in another file, as demonstrated.
      • MVC? (Score:3, Insightful)

        by 5n3ak3rp1mp ( 305814 )
        I think the most promising thick-client app development model is the Model-View-Controller [wikipedia.org] paradigm, as seen in such well-designed app frameworks as Cocoa [apple.com] for OS X, and of course Ruby on Rails [rubyonrails.com], and although I see SkunkWeb improving the "typical" drudgery of web-app dev, I would wonder what it provides in the way of code management when it comes time to test your controller without worrying about how the view renders it or the model stores it.

        And I know this is a personal preference and all, but... Python'
  • by erotic piebald ( 449107 ) on Wednesday August 10, 2005 @02:55PM (#13288116)
    No new 'paradigms' until we get all the other 'salvations' under control.
  • by Anonymous Coward on Wednesday August 10, 2005 @02:55PM (#13288121)
    Who are the next programming models?
  • by G3ckoG33k ( 647276 ) on Wednesday August 10, 2005 @02:57PM (#13288137)
    That's the first one I learned. Now I'm into the lasagna model, with nice layers. Anything beyond that? Well, not me.
  • by khendron ( 225184 ) on Wednesday August 10, 2005 @02:58PM (#13288152) Homepage
    Here's a new model who can program:

    "Prior to being crowned Miss Universe 2005 [missuniverse.com], Natalie was a motivational speaker, model and a fundraiser. She recently received a Bachelor's Degree in Information Technology Management and Marketing from Ryerson University..."
    • if there's any reason for the aliens to wipe us earthlings off the face of the galaxy, it's for having the audacity to call our beauty contests "miss universe". I mean, really, how typically presumptuous.

    • Information Technology Management means you want to control the techies, but are too lazy / not smart enough (all math and stuff, OOOOO) / not willing / too smart (let other people do the hard work) to do it yourself; it certainly does not mean you can program. I have worked with enough of these grads to know that in general they are a disaster, and then they try to tell you how to do your work. My average reaction: go and do a real degree please, and do not come back until you finish it.
    • She recently received a Bachelor's Degree in Information Technology Management

      I highly doubt she can program.
    • A "Bachelor's Degree in Information Technology Management and Marketing" qualifies her to be a programmer how exactly?

      Sounds like a PHB degree to me...
    • Technology Management and Marketing

      Nooooooooo!!!
    • by tb3 ( 313150 )
      Sorry, Ryerson isn't really a university, anyway. It's a community college, known to the locals as 'Rye High', that was only recently granted 'university' status. I doubt much has changed other than the name.
  • What about Small (Score:3, Interesting)

    by mu22le ( 766735 ) on Wednesday August 10, 2005 @02:59PM (#13288163) Journal
    I have heard marvels of Embryo [enlightenment.org], Enlightenment's version of SMALL [compuphase.com].
  • by kinkadius ( 882692 ) <.moc.liamg. .ta. .edaknik.> on Wednesday August 10, 2005 @03:01PM (#13288181) Homepage
    the Cocoa/Objective-C implementation might be worth talking about, especially how it has evolved from its roots in NeXTSTEP.
    • Actually, Cocoa hasn't come that far. Bindings are new, and nice. Core Data is a lot like EOF (although more document-centric). Everything else has been around since OpenStep (although it's slightly more refined).

      NeXTStep was a lot more primitive - no Foundation, just AppKit and C-based libraries.

      It really is depressing watching demos of C# and .NET/Mono, and seeing them being touted as new and shiny, and seeing how far they are behind where NeXT was a decade ago.

  • How about... (Score:3, Insightful)

    by spikexyz ( 403776 ) on Wednesday August 10, 2005 @03:01PM (#13288182)
    ...we stop creating new languages and use what's out there to do something useful for a bit.
    • This is more than just a little insightful.

      How many times do we have to re-invent the wheel? How many languages promised re-usability? How many object oriented class libraries were written that couldn't be effectively re-used?

      --jeff++
  • Not truely new (Score:5, Insightful)

    by Frans Faase ( 648933 ) on Wednesday August 10, 2005 @03:02PM (#13288191) Homepage
    That a certain technology is hyped does not mean that it is new. These are not really new programming models. And whether we should be happy about them, I don't know, because they seem to make things more complicated than they already are. I wonder how long it will take until we see some programming models that are more specification oriented, rather than just being another type of implementation oriented way of programming.

    In a specification oriented programming model, you specify the behaviour, not all the million little steps that are needed to perform it. A specification oriented programming model is independent of the underlying techniques, such as networking protocols and marshalling techniques. I think such a specification oriented programming model should be data oriented, meaning that data is the starting point, not an event driven GUI front-end, as it is now with most programming models.
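
    For a small taste of what "specify the behaviour, not the steps" can look like today, here's a property-based sketch in Haskell (assuming the QuickCheck library; my example, not from TFA). The property states what must hold and says nothing about how reverse is implemented; the tool checks it against random inputs:

      import Test.QuickCheck

      -- the specification: reversing twice is the identity
      prop_reverseTwice :: [Int] -> Bool
      prop_reverseTwice xs = reverse (reverse xs) == xs

      main :: IO ()
      main = quickCheck prop_reverseTwice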

    • The thing that bugs me is that there are so many systems of programming that I think the people who develop them ought to slow down a bit. Programming languages seem to come and go like clothing fads.
  • Ye gads (Score:5, Insightful)

    by hey! ( 33014 ) on Wednesday August 10, 2005 @03:02PM (#13288192) Homepage Journal
    I won't discount the importance of Ajax and "RIAs" as a deployment model -- even as a kind of domain within which system architectures could be grouped. But these aren't new programming models. We use the same old programming models to build new kinds of apps.

    Examples of Programming Models:
    0) Hardware based programming (plugboards etc)
    1) Stored program (program as data)
    2) Assembly programming
    3) High level language programming
    4) Structured
    5) Functional
    6) Object oriented
    7) Aspect oriented

    • Don't forget macro, interpreted, or runtime-modifiable models.

      Personally, I think we'll see a model with extremely strong typing, giving the data more control over the execution flow. Think of how a JPEG image can also be a collection of bytes, and can also be wrapped up in DRM and given rights. By DRM I mean digital rights management, where the data can be prevented from moving between users, increasing the security of the system.
  • by under_score ( 65824 ) <mishkin@be[ ]ig.com ['rte' in gap]> on Wednesday August 10, 2005 @03:02PM (#13288193) Homepage
    One of the common anti-patterns is over-relying on tools and frameworks instead of inventing new programming models.

    Actually, he missed the real anti-pattern. It's this: over-relying on tools and frameworks and programming paradigms and processes instead of improving the skills and knowledge of the people doing the programming.

    I've been programming for a long time too, and I don't think that new programming models do all that much for productivity compared to finding good people or investing in improving the people you have. The recent Joel on Software article [joelonsoftware.com] discusses this at length. This is one of the big reasons I'm so interested in agile methods [agileadvice.com] and principles [agileaxioms.com].

  • by seafoodbuffet ( 527069 ) on Wednesday August 10, 2005 @03:02PM (#13288194)
    Rich Internet Applications are hardly the next "new" thing. The idea of doing asynchronous applications in HTML/DHTML has been around since at least 1997. It's only the recent broad-based browser support that has led to the growth of AJAX, etc. However, trying to program an RIA that targets multiple browsers is like trying to write portable C code all over again. Thought CSS was screwed up between Firefox and IE? Try looking at the JavaScript implementation differences between the two platforms. Throw in a bit of Safari and Opera and you have all the makings of some super-gross client code.
  • Web as new platform (Score:3, Interesting)

    by Sv-Manowar ( 772313 ) on Wednesday August 10, 2005 @03:03PM (#13288200) Homepage Journal

    The trend towards RIAs/webapps has traditionally been restricted to those in a database-centric role, but with the increasing use of AJAX and the like, the webapp is pushing further into the desktop application space. Obviously the centralization and server-side nature of the applications helps deployment and maintenance, but developers are basically trading the platform of an operating system for the platform of a web browser, with all the intricacies and compatibility issues that follow both.

    Webapps are a good direction to take for data access apps, but where the line becomes less clear cut and extreme amounts of javascript/dhtml are needed to achieve behaviours, the apps can become somewhat clunky and difficult to use. To me, it's essential that the designers of today's webapps realise the limitations of what they're working with and when to use traditional desktop apps.

  • by myowntrueself ( 607117 ) on Wednesday August 10, 2005 @03:04PM (#13288218)
    Well, let's see now, programming metaphors for the modern age?

    There's oil-oriented programming (everything is a pipeline), terror-oriented programming (everything is a suicide bomber) and dollar-oriented programming (everything has a mandatory dollar sign at the beginning), to name but a few.
  • by pjkundert ( 597719 ) on Wednesday August 10, 2005 @03:06PM (#13288230) Homepage
    For low-level stuff, lock-free and wait-free algorithms are the next hot thing. For massively parallel systems, they provide levels of utilisation and efficiency that are unreachable by code that relies on locks.

    http://www.nwcpp.org/Downloads/2005/Lock-Free.pdf [nwcpp.org]
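
    For a rough flavor of the idea (not the PDF's code; a toy Haskell sketch, and as I understand it GHC implements atomicModifyIORef' with a compare-and-swap retry loop rather than a lock): four threads hammer one counter, and no thread ever blocks holding a lock, so a stalled thread can't wedge the others.

      import Control.Concurrent (forkIO)
      import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)
      import Control.Monad (replicateM, replicateM_)
      import Data.IORef

      main :: IO ()
      main = do
        counter <- newIORef (0 :: Int)
        dones <- replicateM 4 $ do
          done <- newEmptyMVar
          _ <- forkIO $ do
            replicateM_ 10000 $ atomicModifyIORef' counter (\n -> (n + 1, ()))
            putMVar done ()
          return done
        mapM_ takeMVar dones          -- MVars only join the threads;
        readIORef counter >>= print   -- the counter is never locked: 40000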

  • by Anonymous Coward on Wednesday August 10, 2005 @03:07PM (#13288240)
    What on earth? This article is tripe unfit for anyone but managers. He's put new buzzwords on the things he's describing here, but not one of them is actually new.

    First off, the "rich internet application" model he harps on is at this point about ten years old, since CGI programming first appeared. It hasn't changed that much since then. We figured out the idioms and patterns to make that work very quickly, and we've been using them since then. The only new development here is the "on rails" type stuff-- but that is nothing more, or less than the same model as all CGI has used, only now it runs faster. It is an optimization. Not anything new.

    Second off, what the hell is a "composite application"? Seriously? It sounds like he's just describing an application which embeds a client-server model. Well lah de frickin dah. This is not new, and it is not at ALL linked to "Java and .NET". We have some new and better tools for RPC-based programming, what with WDSL or WSDL or whatever and all these other new acronyms, but we're still doing the exact same thing in the exact same way that we were doing in the 80s with CORBA and Distributed Objects.

    If when this guy says "recent" he means "the last 20 years", then yes, that is a good coverage of the improvements in programming we have had since 1980. But since he seems to mean things a bit more recent than that, it looks like he's just playing the old analyst game of putting a new name on an old concept and pretending it's the most important thing ever. Unfortunately, giving something a buzzword isn't the same thing as inventing it.
  • by Paul Johnson ( 33553 ) on Wednesday August 10, 2005 @03:09PM (#13288250) Homepage
    See Haskell [haskell.org].

    Functional programming greatly simplifies the task of the programmer by removing execution order from the things that programmers have to keep track of. Just as garbage collection in Java got rid of the need to recycle memory manually, so in Haskell the execution order is a matter for the compiler to optimise rather than for the programmer to worry about.

    Historically functional programming has had problems doing IO: languages have had to admit impure side effects to do IO. Haskell has a wonderful solution to this problem, which unfortunately this post is too small to contain (really: go see!).
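
    To show the shape of it without spoiling the monad tutorial, a minimal sketch of my own: the pure part is just values in, values out, with no ordering to track, while effects live in IO and are sequenced explicitly by the type.

      -- pure: the compiler is free to evaluate this however it likes
      squareAll :: [Int] -> [Int]
      squareAll = map (^ 2)

      -- impure: IO actions, ordered by the do-block
      main :: IO ()
      main = do
        line <- getLine
        let ns = map read (words line) :: [Int]
        print (squareAll ns)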

    Paul.

  • Good Design (Score:5, Insightful)

    by Tiger4 ( 840741 ) on Wednesday August 10, 2005 @03:10PM (#13288262)
    never goes out of fashion.

    Pick a good language/environment, even a not-so-good one, say C and a text editor, and then use some engineering discipline to really DESIGN THE DAMN application. Don't just throw features at it; don't just hack the code. Think about the real-world problem you are supposedly trying to solve and work your way through it. Build it right, and you won't have to worry about operation, maintenance, or longevity. Build it wrong, and you'll make a career of fixing it.

    Oops, maybe I've stumbled onto the real secret of IT...

  • I guess when I think of 'models of programming' I think about things like Object Oriented or Functional programming categories. This article seems to confuse the idea of 'models of programming' with actual types of applications: desktop vs. Web apps or perhaps a fusion of the two. Now one could program either a desktop or web app (or an RIA) using either an Object-Oriented approach, declarative, functional or even a combination of them. Let's not confuse the application with the programming model (or pe
  • by starseeker ( 141897 ) * on Wednesday August 10, 2005 @03:14PM (#13288292) Homepage
    To my mind what we need is not more models, but some FINAL model - i.e. a way to implement programming logic in such a way that it will never need to be implemented again.

    Think about it - how much programming out there is a duplication of some other effort, at least in some of its logical components? I'd say what we need is two things:

    a) A database of implemented programming logic - maybe not a database proper, but something that lets you ask whether the ability to say "given this, do this" already exists.

    b) A programming method that involves designing an application such that you break each top-level logical component/ability down until you a) know that you have to implement it or b) it is found to have already been done. I'm guessing b will be the norm, and as more and more logical components are added to the database, the point at which b) is found should get higher and higher in the design stage.

    And the programming language bias should, at the database level, be a moot point. The database itself should define its algorithms and logic in such a way as to be workable in automatic proof assistants like acl2 and HOL4, and generate code in the required language as needed. Surely for a properly specified algorithm there must be some well defined way to generate it as code, provided the language specs are up to par. This is deterministic behavior, after all. Perhaps different algorithms for the same function can be added, and a choice made on a per language basis, but I'm dubious that this would be needed in an ideal world.

    In a world with open source as a working reality, there should never be a need to implement anything non-trivial. Design should mean specifying only things that don't already exist. Object-oriented programming is a nice step in that direction, but it doesn't let people know a) what's out there and b) what the quality of it is. I say let's bring formal methods to their full potential, and reduce the amount of work the programmer must do to the irreducible minimum. Programmer time is too valuable to waste on re-implementing things. Standardize everything that can be done "right", and have the human being do ONLY the part he/she is good at - deciding what needs to be done from a USER standpoint - i.e. WHAT to do. How to do it should be, as much as possible, decided once and correctly, and then not again.
    • by Coryoth ( 254751 ) on Wednesday August 10, 2005 @03:52PM (#13288552) Homepage Journal
      A programming method that involves designing an application such that you break each top-level logical component/ability down until you a) know that you have to implement it or b) it is found to have already been done.

      That already exists, and the specification is indeed amenable to proof tools (several specification languages use HOL as their proof assistant even!). Check out B-method, [fmnet.info]HasCASL [uni-bremen.de], SPARK [praxis-his.com], Extended ML [ed.ac.uk], or even Z [usingz.com] and VDM [ncl.ac.uk]. There are tools like Perfect Developer [eschertech.com]. There are specification extensions to Java like JML [iastate.edu] that support extended static checking and proof via other tools.

      Uptake has been slow, and the tools associated with this stuff are still maturing (despite the fact that formal specification is a relatively old field - tracing its way back to Dijkstra and Hoare in the late '60s). Doing specification properly tends to require a little more math background, and does take some work. More importantly, for a great many projects, it simply isn't suitable. There is no magic process you can follow that makes everything work, and there is no "final" programming model. There is whatever mix of techniques and models suits the project at hand. Good developers are ones who know lots of models and techniques and adapt them to best fit the problems at hand.

      That said, specification is sorely underrated and underused as a programming technique. Too few people are well acquainted with it, and almost all the complaints that get raised are based on myths and misconceptions. It's not right for everything, but there are plenty of places where it could and should be used. Knowing how to do proper formal specification is simply another weapon in a good developer's arsenal, and I wish more people spent the little extra time required to learn something about it.

      Jedidiah.
  • While there are a lot of new technologies out there, they all have 'roots' in older technologies. Python is written in C for one example. IMHO, the next language breakthroughs will be developed with programmable hardware, a CPU that is a non-directed series of gates, which can be recombined to form whole new series of CPU's. With that kind of system a whole new construct of operations can be experimented with, and can be bound directly to the languages of the future. Functions become commands, commands beco
  • Continuations (Score:4, Insightful)

    by Masa ( 74401 ) on Wednesday August 10, 2005 @03:16PM (#13288306) Journal
    Functional programming [wikipedia.org] and continuations [wikipedia.org]. One present day example is the UnCommon Web [common-lisp.net], which is a web application framework implemented with continuations.
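
    For the curious, a minimal continuation sketch in Haskell (using the mtl package's Control.Monad.Cont; my toy example, not UnCommon Web code). callCC hands you "the rest of the computation" as a value; invoking it abandons the normal flow, which is the same trick continuation-based web frameworks use to suspend and resume a session:

      import Control.Monad (when)
      import Control.Monad.Cont

      -- calling exit jumps straight out, like an early return
      safeDiv :: Int -> Int -> Cont r (Maybe Int)
      safeDiv x y = callCC $ \exit -> do
        when (y == 0) (exit Nothing)
        return (Just (x `div` y))

      main :: IO ()
      main = do
        print (runCont (safeDiv 10 2) id)  -- Just 5
        print (runCont (safeDiv 10 0) id)  -- Nothing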
  • Typical Slashdot.. (Score:2, Insightful)

    by coronaride ( 222264 )
    Mod me down as flamebait for pointing this out, but did anyone else notice that the posted link for .NET went to the Mono homepage? Yeah, they deserve all of the credit for .NET. As a counterpoint, the Java link went to the Sun homepage...what's the deal?
  • by tarzeau ( 322206 ) <gurkan@aiei.ch> on Wednesday August 10, 2005 @03:31PM (#13288406) Homepage
    That counts.

    Let's have a look at programming languages http://www.linuks.mine.nu/gnustep/langs.txt [linuks.mine.nu]

    And an excerpt from a book (I can find you the title and ISBN if you want): Although both Objective-C and C++ derive from C, C++ is a systems-level language, whereas Objective-C is an applications-level language. The distinction can be summarized by saying that C++ was designed with program efficiency in mind, while Objective-C is geared more toward programmer efficiency. The difference is substantial--C++ is driven by a philosophy of efficiency and compatibility with existing C which, while necessary for a low-level language, proves quite restrictive in other contexts.

    And now, a quote from the almighty Booz Allen study:

    * took 100+ senior programmers and trained them on NeXTstep, then asked them to write the same app on both NeXT and their previous system.
    * The first application was written 2-5 times faster.
    * Savings were 90%
    * 83% fewer lines of code in the NeXTstep version
    * 82% said NeXTstep was better in ALL categories
    * It isn't faster to code on NeXTstep; you just have to write less of it. The revolution is "getting rid of software".

    more about all this stuff, here: http://livecd.gnustep.org/ [gnustep.org]

  • by throbbingbrain.com ( 443482 ) on Wednesday August 10, 2005 @03:40PM (#13288464)
    Java doesn't cut it, primarily for [highly interactive user experiences] reasons
    Java will do everything the author wants but he completely discounts it because of some poorly coded Swing applet that crashed his 486 PC. Java will provide any user experience that a developer is capable of creating.

    • Mod parent up -- he's right!

      Java applets are ideal for RIA. Swing and Java2D provide everything a developer needs to make a world-class GUI; and technologies like Thinlet [sourceforge.net] have lots of potential too.

      Put a JMS server on the back end and you have an ASYNCHRONOUS rich-internet application -- which is unheard of in other technologies like Flash and even Flex.

      People say "Java Applets are slow" and that may have been true 5 years ago, but machines are much faster now, and everyone I know have at least Java 1.4 in

    • Except the one that the user is actually used to: native widgets. SWT is the way to go, not Swing. Swing will always be an emulation of the real thing. Still Java, but it might take some time for the GUI (toolkit) developers to realize this. Test Azureus (a BitTorrent client) and Eclipse (a really good, free Java IDE) to get an idea. On Java 1.5 of course; any other Java runtime is (or should be) history.
  • PLOP (Score:4, Interesting)

    by lheal ( 86013 ) <lheal1999NO@SPAMyahoo.com> on Wednesday August 10, 2005 @04:00PM (#13288616) Journal
    I'm pretty sure it will never be the rage, but I like Programming Language Oriented Programming for difficult problems that don't seem doable in C/++ or something similar.

    Most programs can be written practically in most languages, since all you really need is "if", "decrement" and "goto". Some problems aren't a good fit for a given language. That's why there's more than one.

    Any program that breaks its problem into chunks is in effect creating its own mini-language. Whether you call it Abstract Data Typing or Object Orientation or Functional Programming or even Top-Down Design, what it comes down to is dividing the problem into manageable chunks and working with those chunks until done.

    I wish all CS students were taught from day one, or maybe day fifteen, how to create their own programming language. Usually you have to take a compilers course to get that.

    Creating a new language is not that hard. It gets a bad rap because people think they have to write a backend for a given architecture, but writing the backend to generate C++ or some other HLL is just as good, since they've already done the heavy lifting and you can automate the compile train with your favorite maker.
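
    The core of such a mini-language fits in a handful of lines. A toy Haskell sketch (mine, for illustration): define the abstract syntax as a data type, and the "backend" is just an evaluator, which you could later swap for a code generator targeting C++ or any other HLL:

      -- a tiny arithmetic language: AST plus evaluator
      data Expr = Lit Int
                | Add Expr Expr
                | Mul Expr Expr

      eval :: Expr -> Int
      eval (Lit n)   = n
      eval (Add a b) = eval a + eval b
      eval (Mul a b) = eval a * eval b

      main :: IO ()
      main = print (eval (Add (Lit 2) (Mul (Lit 3) (Lit 4))))  -- 14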
  • by ZorbaTHut ( 126196 ) on Wednesday August 10, 2005 @04:10PM (#13288683) Homepage
    nobody seems to be interested in developing.

    I program console games. We've got very strict RAM limits - from 384 KB on the GBA to 64 MB on the amazingly spacious Xbox. (With some curious design decisions that can make it feel smaller than the 32+4 MB PS2, but I digress.)

    On systems like this you've got to track pretty much every byte. One meg of garbage collector overhead means one meg you don't have available for useful stuff. I generally don't use standard dynamic allocation - at all - it's just too expensive. Maybe one big pool to load files into on the PS2 that can be cleared entirely between levels. Nothing like that on the GBA of course.

    As far as I can see, there are three languages that provide this necessary feature - ASM, C, and C++. So I use C++.

    I'd love to see an "improved" C++. But it seems like every time someone decides to improve C++, the first thing they do is tack on a garbage collector and get rid of direct memory access. And, you know, those are features I desperately need. Frequently those unwanted features are the only way I can even display graphics.

    And yes, it's possible to write modern games in languages with garbage collectors (as I understand it, the entire Jak and Daxter series was written in Lisp) but I know what lengths I go to to squeeze performance out of these systems - I really don't need a garbage-collected albatross hanging off my shoulder.

    And before anyone says "garbage collectors are faster than deallocating things manually!" - if I don't *allocate* anything, what makes you think I need *deallocation*? There is no heap. Move on.
  • by MilesParker ( 572840 ) on Wednesday August 10, 2005 @04:13PM (#13288709)
    The winner is...
    I think DSLs are going to radically change the way that people code. DSLs potentially provide the meta-programming capabilities of LISP with the transparency and idiot-proofing of a language like Java. We may even see a hierarchy of software engineering develop, with one type of high-level coder developing DSLs and others able to use these languages easily within their own areas of expertise. For more, check the following links:

    http://www.jetbrains.com/mps// [jetbrains.com]
    http://www.martinfowler.com/articles/languageWorkbench.html [martinfowler.com]
    http://intentsoft.com/ [intentsoft.com]
  • by threaded ( 89367 ) on Wednesday August 10, 2005 @04:16PM (#13288733) Homepage
    What I like about these new programming models such as Ruby, Ruby on Rails etc. etc. is how much like Lisp they are.

    If you've never done a real programming course you've never been taught Lisp...

    Yippee, fewer bluffers in the pool, more fish for those who can hunt.
  • by DirkWessels ( 842453 ) on Wednesday August 10, 2005 @04:27PM (#13288833)
    All the improvements I have seen in programming languages over the last 10 years had already been done in Smalltalk from the beginning.
    That is because everything is an object, even the programming constructs (classes are objects, and if/then is a message called #ifTrue:ifFalse:).

    Future languages might even be more dynamic, and include Lisp-like (or Haskell-like) constructions that solve problems by defining the answer (functional and logic programming).
    Which in Smalltalk syntax might be: [:x | x * x = 5.0] solveFor: #x.
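
    The "control structures are just objects/messages" idea ports to other languages too. A small Haskell analogue (my sketch): because evaluation is lazy, if/then/else can be an ordinary function rather than baked-in syntax, much like Smalltalk's #ifTrue:ifFalse:.

      -- control flow as a plain function; laziness evaluates one branch
      myIf :: Bool -> a -> a -> a
      myIf True  t _ = t
      myIf False _ e = e

      main :: IO ()
      main = putStrLn (myIf (2 > 1) "then-branch" "else-branch")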


    While Smalltalk (ST) is advanced, it also runs into the problem of managing 60,000 classes (or more). And everyone can see that simply grouping the classes into separate modules, as Java does, does not help. Even the Object class should be redefinable, preferably at a local level. There are some programs on top of ST that help a bit, but I would personally like to see it done a bit better.
    Another problem is that there are so many interfaces to different storages and systems. So we need C interfaces, C++ interfaces, SQL interfaces, XML interfaces... etc.

    So any future programming model should have:
    - objects everywhere. (ST)
    - Be very simple and compact. (ST)
    - Easy to use and understand. (ST)
    - allow scripting (or runtime compilation) possibilities (ST)
    - easy modularizing of classes, methods and objects.
    - Allow distributed data and execution. (ST)
    - Allow easy interfaces to different storages and systems.
    - Integrate easily in the system

    Any future object system will be graphical and allow different programming models (logical, functional, procedural, storage, user interface) to be built from graphical building blocks.
    Already we can see some of this happening in:
    * XML tools (data definitions and interfacing),
    * VisualAge (procedural program definition, ST again),
    * NetBeans / Delphi / Visual Basic (user interface) (skipping many ST examples),
    * web tools (Ruby on Rails (Ruby is based on ST), Seaside (built in ST)).
  • by juanescalante ( 848065 ) on Wednesday August 10, 2005 @06:17PM (#13289838) Journal
    I once read a paper by Cristina Videira Lopes, a pioneer of aspect-oriented programming, in which she stated that AOP is a significant breakthrough, but that the next step is to include elements of natural language in programming languages.

    She says that natural language is not suited to writing computer programs, but it has powerful elements that can be useful in transferring ideas more closely to the way we think. Examples of such elements are temporal references such as before and after.
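
    Those temporal elements are roughly what AOP's before/after advice captures. A toy Haskell combinator (my own sketch, not from her paper) showing the shape:

      -- wrap an action with "before" and "after" behavior
      withAdvice :: IO () -> IO () -> IO a -> IO a
      withAdvice before after action = do
        before
        result <- action
        after
        return result

      main :: IO ()
      main = withAdvice (putStrLn "before") (putStrLn "after")
                        (putStrLn "the action itself")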

    You can read the abstract to one of her papers here [acm.org]. Very interesting stuff.
  • by owlstead ( 636356 ) on Wednesday August 10, 2005 @06:55PM (#13290165)
    These will be more important than any programming language. The way Java or .NET handles components should be an eye-opener. What you want is code you can control, code that does what you expect it to do.

    On the runtime part:
    - plugins (see Eclipse and OSGi technology)
    - assemblies/libraries (see .NET framework)
    - VM support (garbage collection, overflow handling, exception handling, bounds checking etc.)
    - runtime information (reflection)
    - supporting components (application servers, message services)

    On the IDE part:
    - parsing editors (see Eclipse)
    - code analyzers (PMD)
    - semantic links from code to design tools (needs a parsing editor to function best)
    - unit testing

    I see a major shift towards runtime technologies coming up ahead. I can see more flexibility coming in how programs are run and objects are used. Compilers are already running in the background to let Java be used both as a scripting language and as a compile-time language, for instance. Java may be too strict on some issues, however.

    For programs, components, OO and the imperative model will probably be here to stay. Other languages will be used in their respective domains, but the language wars seem to be over for now (as each programming language looks more and more like its siblings). Let's focus on the runtime and supportive technologies. And on getting things running reliably, for crying out loud.

    I don't think using multiple languages that try to accomplish the same thing is such a good idea (see .NET's C++, C#, VB7 and J#). You end up learning all of them (see MSDN). Mixing in languages that use other programming paradigms could be useful, though.

    And yes, this is also an opinion piece, as is the parent.
  • easy (Score:3, Funny)

    by geekoid ( 135745 ) <dadinportlandNO@SPAMyahoo.com> on Wednesday August 10, 2005 @08:20PM (#13290707) Homepage Journal
    BPEL
  • Pendulum (Score:3, Interesting)

    by Tablizer ( 95088 ) on Thursday August 11, 2005 @12:02AM (#13291814) Journal
    I see a pattern of shifting back and forth between data-oriented techniques and behavior-oriented techniques. I personally prefer data-oriented techniques, and enjoy "collection-oriented programming" using concepts from the likes of relational languages and APL's step-children.
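
    By collection-oriented I mean whole-collection operators instead of element-at-a-time loops. A small Haskell sketch (names and data invented for illustration):

      import Data.List (sortBy)
      import Data.Ord (Down (..), comparing)

      type Row = (String, Int)  -- (name, score): a toy relation

      -- filter, order, and limit the whole table in one pipeline
      topScorers :: Int -> [Row] -> [Row]
      topScorers n =
        take n . sortBy (comparing (Down . snd)) . filter ((>= 60) . snd)

      main :: IO ()
      main = print (topScorers 2 [("ann", 91), ("bob", 55), ("cy", 78), ("di", 84)])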

    OOP is perhaps the pinnacle of behavior-orientation, in that interfaces tend to be thought of as behaviors applied to things. It is about time we swung back to the data side for a while.

    I would also like to see more exploration of "separate meaning from presentation" such that syntax or views of logic can be customized to the developer and/or a particular need. Microsoft's CLR (or is it CRL?) is a baby step in that direction because it allows multiple syntaxes on top of more or less the same interpreter.
               
  • missing acronyms (Score:3, Informative)

    by anomalous cohort ( 704239 ) on Thursday August 11, 2005 @05:55PM (#13298698) Homepage Journal

    With this topic, I expected discussions on such technologies as MDSD [mdsd.info], MDA [omg.org], or AOP [aosd.net] yet there is no mention of these here. Does everyone here consider them to be DOA?
