Ask Slashdot: What's New In Legacy Languages? 247

First time accepted submitter liquiddark writes "I was listening to a younger coworker talk to someone the other day about legacy technologies, and he mentioned .NET as a specific example. It got me thinking — what technologies are passing from the upstart and/or mainstream phases into the world of legacy technology? What tech are you working with now that you hope to retire in the next few years? What will you replace it with?"
  • by Alexander Karelas ( 3477949 ) on Saturday March 08, 2014 @05:13PM (#46436099)
    Perl rocks (Mojolicious, AnyEvent, Moose)
  • by innocent_white_lamb ( 151825 ) on Saturday March 08, 2014 @05:19PM (#46436121)

    I've been playing with computers since the mid-70's and one of the things that I did early on was learn to program in C.

    One of the smartest things I've ever done; it's up there with my decision to start running Linux in the late 90's.

    If you can program in C you can write a program that runs on pretty much everything that you'll come across that you might want to program.

    Learn C if you want to learn a programming language that you can use for a very long time.

    I like Android, got an Android phone and a couple of tablets, but the C NDK doesn't allow you to do things without having to jump through a bunch of Java hoops to get there. I would have more Android devices if it was easier to write a program on it in C.


    (Hip hip array!)

    • Re: (Score:2, Insightful)

      by ThorGod ( 456163 )

      Have you considered Objective-C for mobile development...

    • If your goal is to build Android command-line apps, you can do it with the NDK by modifying your Android.mk like this:

      include $(CLEAR_VARS)
      LOCAL_MODULE := progName
      LOCAL_SRC_FILES := $(wildcard $(BASE_SRC_DIR)/*.c)

      # build a standalone executable rather than a shared library
      include $(BUILD_EXECUTABLE)

      The last line is the crucial one (and it's not documented).
      • Re: (Score:2, Informative)

        Actually, I would love to find a method for programming Android in C and interacting with the user through webkit. That way I could create an app entirely in C with a html/javascript frontend to interact with the user without having to horse around with Java to get a usable app.

        Unfortunately, I haven't found a way to do that yet. (If someone here knows how to do that, by all means sing out!)

        The irony of mobile computing is how bloody difficult it is to write a simple C program to run on one of those things.

        • by LesFerg ( 452838 )

          I'm not a regular C++ programmer or user of Qt, but as a casual observer it seems that the mobile/embedded APIs in Qt 5.2 could provide a fresh new approach for Android and other mobile platforms. However, they are steering clear of easy webkit use, although there are ways to fiddle around with JNI to get a web interface of some sort, and there are hints at Qt WebEngine (based on the Chrome engine) maybe being available in Qt 5.3 (or only in the enterprise version?). Blogs and news releases seem to vary.

          I am a l

        • I've done something similar to that. I wrote a webserver in C, then ran it as a service from the command line on the Android device, attached to localhost. Then the user opened a browser to http://localhost:38456/ or whatever.

          You could do something similar, bundle the C browser with your app, and have your app launch the browser when the app starts (the easiest way would probably be to make a JNI 'startBrowser()' function or something). Make your Java UI essentially a WebView object that op
          • Nobody needs to write their own webserver; try Civetweb - the MIT-licensed version of mongoose - which is small (180k) and gives you a web server embedded in your program in 3 lines of code. It works on Android too.

    • If you can program in C you can write a program that runs on pretty much everything that you'll come across that you might want to program.
      I would have more Android devices if it was easier to write a program on it in C.

      So C is not that useful for modern platforms, after all?

  • by msobkow ( 48369 ) on Saturday March 08, 2014 @05:21PM (#46436131) Homepage Journal

    Apparently the writer of the original article thinks "legacy" means that you have to maintain and enhance existing applications instead of developing new ones.

    To me, "legacy" means that there are no new applications being developed in that language, and the only jobs available for it are maintaining and enhancing existing applications. .Net and Java are certainly not "legacy" in that sense by any stretch of the imagination.

    • by Art3x ( 973401 ) on Saturday March 08, 2014 @09:33PM (#46437265)

      "Legacy" is a buzzword for "old."

      Multisyllabic and euphemistic, I'm sure it first came into being from the lips of an advertiser.

      But if you want to think, write, and reason clearly about a subject, stick to the old, short words, the ones your mind retranslates to anyway after hearing them.

      • "Legacy" is a buzzword for "old."

        Pretty much. In programmer-speak, "legacy code" is anything that was already there when you started working on the project.

        So if somebody else was writing the program last month, but they left and your team started working on it this week, anything already in the project is "legacy code". That's basically what the word "legacy" means: something you "inherited" from someone else. (Or in this case, some other project, coder, or team.)

        Some people have started using "legacy" to mean "old", but that's real

        • I personally would say legacy code is code that is planned to be refactored/replaced.

          I write plenty of code, first time, which is designed to last... until it NEEDS replacing.

  • by PhrostyMcByte ( 589271 ) on Saturday March 08, 2014 @05:22PM (#46436135) Homepage

    Certainly not all of .NET -- as a whole it's anything but legacy and evolves at a fairly rapid pace -- but it also includes a lot of old cruft and a few poor design choices that affect even modern code.

    Other legacy tech I'd love to get rid of: SSIS/SSRS -- terrible SQL Server drag-and-drop technologies that do a lot of stuff badly. 1D barcodes like Code 39 and Code 128 -- dead simple to implement but take up a lot of space and are prone to poor reads.

    • by LesFerg ( 452838 )

      And yet employers seem to discriminate heavily against people who have not been working with the latest version of .Net, and expect us to pass tests on the most obscure and arcane features of .Net 4.5, many of which, as far as I can tell, will probably never be required in basic web solutions anyway.

      Oh, and I didn't get a particular job because I didn't have SSRS experience! Laughed my arse off at that one.

      • Really? Sheesh. I have SSRS experience, and I'm laughing my ass off at that. "SQL wrapped in XML" is not exactly an arcane skill....
        • by siride ( 974284 )

          No, it's arcane. You have to learn all the workarounds for the various gotchas, misfeatures and outright bad designs in SS*S. It is dark sorcery; I hate it, and I hate having to report to the users that "no, I can't do that, because SS*S is a pile of shit".

  • Important question (Score:5, Interesting)

    by DoofusOfDeath ( 636671 ) on Saturday March 08, 2014 @05:24PM (#46436149)

    I'm mid-career, and this question feels very relevant to me. For the past 15 years or so, I've focused on cross-platform (mostly Linux) C++ programming, with a decent SQL understanding as well.

    But lately, the market for straight-up C++ Linux jobs seems to be waning. For one thing, more dynamic / introspective languages such as Java, Python, Javascript, and C# seem far more prevalent than my loved/hated C++.

    But a bigger trend seems to be a shift to frameworks, rather than languages + OS's, as the focus of job-posting requirements. It's no longer true that C++/Java + Linux + SQL experience lets me quickly find a well-paying job in the city of my choice. For that, one also (or instead?) needs competence in something like Hadoop, Cassandra, Ruby on Rails, Amazon Web Services, Django, etc.

    This makes sense, as CPU's get faster, and as software development continues shifting to a more connected, more web-oriented world. But it's a little scary for someone like me, whose day job doesn't offer much opportunity to work with those newer frameworks in order to develop a marketable level of skill.

    I'm guessing this is a periodic problem. Ace mainframe programmers found themselves less marketable when desktop development became popular. Desktop developers were partially outmoded by client/server, and have a more serious marketability problem now that most development is aimed at (ultimately) web browsers. It's not that serious C++/Linux/back-end geeks like me can't find work, it's just that I(we) feel a little trapped compared to those whose skills are currently in broader demand.

    I guess the question, then, is do I(we) double-down on our current expertise and become indispensable in a small fraction of the job market, or do we accept the pros and cons of partially re-inventing our careers (and setting back our salaries) to retool?

    • my good man... you're confusing what's trendy with what's tried and true.

      with 15 years of C++/OOP/SQL experience under your belt, you would have NO problems finding tons of really well-paying jobs in your city of choice.

      • It's true - I was hired for a C++ on Linux job replacing PHP web stuff with C++ web services.

        Then I got there and found it was all C#/WPF bollocks. Fair enough, I gave it a go and it was pretty damn easy, even if WCF is a bloated heap of steaming..

        But when I queried my manager about it, he said he knew a C++ dev could do the job. He was right - knowing C++ meant I knew what the intellisense-driven C# stuff was all about, how it performed and how to get the best out of it, and sort out the nasty bugs that we

    • by houstonbofh ( 602064 ) on Saturday March 08, 2014 @05:52PM (#46436323)

      I guess the question, then, is do I(we) double-down on our current expertise and become indispensable in a small fraction of the job market, or do we accept the pros and cons of partially re-inventing our careers (and setting back our salaries) to retool?

      As someone who has been in the field for about 30 years now, and can still easily find work in spite of how everyone claims IT is ageist... I think I can answer this for you long term.

      Always be learning.

      Seriously, you always need to be re-inventing yourself, studying and working to stay at the edge of the curve. Whatever is being done now will turn old (and then new again and then old again...) and you better have some way of dealing with it.
      However, you do not need to take a pay cut to do so. Start learning Ruby on a personal project. Then get a side gig converting some existing applications to web-enabled Ruby. Now you have the creds to demand a hell of a lot more than the 20-year-old Ruby guy, because you can actually understand what they have and fuse the old and new together. All the new guy can do is burn it all down and start over. This has real value to a lot of businesses.

      And the fact that at 40+ you have a long list of skills but have stayed current with the latest stuff as well really makes you stand out on your resume.

    • Wait, you're saying there was a market for C++ Linux jobs at some point? If you want to use Linux and be the most marketable as a developer you develop in Java (with a little JavaScript sprinkled in). If you want to stay more low-level have you looked at getting into the embedded space?

    • The jobs are shifting to introspective languages because the way people work with computers is shifting from the desktop to the web. It only tangentially has anything to do with the speed of computers. There's just not as much call for desktop programs anymore, because the world has moved to a networked one that isn't tied to a desktop machine running (OS-whatever). My guess is that you'll still have a job in desktop apps programming in C++ for 20 years at least, but the world will change under your fee

  • abaci (Score:4, Funny)

    by hirundo ( 221676 ) on Saturday March 08, 2014 @05:25PM (#46436161)

    My old abacus is giving me splinters. I asked my boss for a new one and he said "cào nǐ zǔzōng shíbā dài". I'm not sure what that means but I'm hopeful.

    • Don't knock an abacus! My abacus lives beside my main computer.

      You really can't bet an abacus for doing binary arithmetic and bit shifting. I don't know about you, but I can't visualize that stuff well in my head so I either grab my abacus or start making slash marks on a piece of paper, and it's a lot more efficient and conducive to thinking when I do it with an abacus.

      • by LesFerg ( 452838 )

        You really can't bet an abacus for doing binary arithmetic and bit shifting.

        But they don't help your speling at all.

    • by LesFerg ( 452838 )

      My old abacus is giving me splinters. I asked my boss for a new one and he said "cào nǐ zǔzōng shíbā dài". I'm not sure what that means but I'm hopeful.

      Well the last part was something about a goat, and the first part was something to do with a broom handle, so maybe your boss was explaining the relative trade value of your equipment requirements.

  • Why .Net? (Score:5, Insightful)

    by Richard_at_work ( 517087 ) <richardprice@gm a i l . com> on Saturday March 08, 2014 @05:26PM (#46436169)

    Why was .Net mentioned as a legacy technology? It's actively developed, has a decent community, and is widely used - apart from some poor articles and jumped-to conclusions here on /., there is no reason to consider .Net a legacy technology.

    What is more likely is that the person was referring to projects that are stuck on specific versions in maintenance hell, which can happen with any language - I've been stuck with VB.Net 2.0 WebForms projects while at the same time using MVC 4 and .Net 4.0. One I would consider a legacy project, the other not, but both use the same line of tech.

    • I think the problem comes from living in a bubble. We all live in a bubble and think of the reality around us as being the reality for everyone else. It's not until you step outside of the bubble that you realise the assumptions aren't necessarily true. What will often be the case is different people solving different problems with different languages. Sometimes it's down to the suitability of the language, sometimes it's down to the local skill set and sometimes down to what's considered to be the latest

    • I think the person the questioner was talking to was a tad out of touch. And we see that regularly on Slashdot with people absolutely convinced that $TECHNOLOGY is never used because they don't see it used in their circle of technology acquaintances.

      The four most commonly used platforms right now are LAMP with PHP (not Perl, not Python, Goddamnedfuckingawful PHP), JEE, .NET, and "front-end web (Javascript/HTML/DOM/CSS)". Coming up the rear are Objective C on iOS, Java on Android, and Native (non-.NET) C++

  • I remember the days, getting started in our computer reuse/recycling business, when we had to boot PCs with "Caldera DOS" and I had to reprimand people for using MS-DOS (MS was threatening a lot of piracy enforcement vs. DOS, even in 2003). The staff looked at me like I was the silliest man in the world.
  • by DaveAtFraud ( 460127 ) on Saturday March 08, 2014 @06:42PM (#46436583) Homepage Journal

    I'm still waiting for FORTRAN to make a comeback. And none of this sissy FORTRAN 77 or FORTRAN 95 stuff either; real FORTRAN IV. If I wanted to program in something that looks like PL/1, I'd program in PL/1.


    • Fortran never went away. It's still used in a lot of engineering and scientific work.
    • you're a pussy. use freakin ALGOL

    • by sartin ( 238198 )

      I'm still waiting for FORTRAN to make a comeback. And none of this sissy FORTRAN 77 or FORTRAN 95 stuff either; real FORTRAN IV.

      Yay for old timers!

      I once worked in a shop that used RATFOR. One of my cow-orkers took great pride in the fact that his code passed through the preprocessor unscathed.

  • by Vellmont ( 569020 ) on Saturday March 08, 2014 @08:02PM (#46436951) Homepage

    It's not, of course, but a man can dream, can't he? .Net isn't dying by any stretch of the imagination. But let's start with languages most people would agree ARE legacy languages:

    COBOL (if you can't agree on this, end of conversation)
    various assembly languages (maaaaybe the 68000 family?)
    FORTRAN (starting to get controversial here since I know it's still used by some crazy science people who don't want to learn anything modern)

    I was about to add Pascal... but then noticed some crazy person is still developing Pascal in the form of freaking Delphi, which even has a port for Android phones. WTF?

    So that makes me think... if I can't include Pascal, or possibly even FORTRAN (languages I've never known anyone to write code in for the past 15 years, yet which still see new releases), then what can I include? I'm sure some nutter will try to argue with me that Forth is still a viable language. COBOL... just go away.

    The better question is more likely: which languages should you really not stake your career prospects on? Personally I'd list any of the above languages, but sadly not yet PHP.

    • by mcrbids ( 148650 )

      As a long-time PHP dev, I recognize that it's very popular to hate on PHP, and has been for some time. And there are some valid criticisms of PHP, particularly from the domain of purity. PHP is a brutish language, with lots of warts. Whether it's the lack of any sort of parallelism or threads, or the random_underscores, or the random(haystack, needle) ordering of arguments in functions, there's plenty to complain about.

      But PHP has its strengths, too. Its translation of strings to integers to hexadecimal numb

      • by dkf ( 304284 )

        And there are some valid criticisms of PHP, particularly from the domain of purity.

        The major problems that most people have with PHP stem from the metric buttload of problems with SQL injection and XSS bugs that they're infested with. I know these are not the language's fault exactly, but it's usually very close to the epicenter of trouble and pain. I suspect that it's the legacy of poor community practice that's the biggest troubling thing in reality (as opposed to what people perceive), and that's very hard to fix; all those badly written applications and tutorials need work.

    • FORTRAN (starting to get controversial here since I know it's still used by some crazy science people who don't want to learn anything modern)

      The latest extant standard is 2008, with the new one due out next year. Fortran 77 would count as very much legacy, but the new Fortran variants seem to be decent enough modern languages.

  • We analysed a legacy application in C# two years ago. That does not mean that C# or .Net is legacy, but that there are software systems implemented with these technologies which are legacy and subject to modernization.

  • Visual Basic, please let it be Visual Basic that's going away.
  • by russotto ( 537200 ) on Saturday March 08, 2014 @09:54PM (#46437335) Journal

    Javascript. PHP. Java and every stinking overarchitected hole of a framework built on top of it. C++ and its various internal metalanguages. Anything which has appeared on more than 10% of the print-it-out-and-it's-good-for-toilet-paper job ads on Dice. All the functional languages, which exist mostly to make people who know them feel superior. Go, because we didn't like Algol-68 the first time around.

    I think we should just go back to counting on our fingers. From that first abacus, we were doomed.

  • Legacy properly describes a software system, not a language. Languages rise and fall in popularity. Sometimes a language has inherent limits, sometimes the implementation stinks, sometimes the syntax or paradigm is no longer fashionable. Sometimes languages and platforms disappear only to re-emerge years later. Back in the late 1990's NeXTSTEP/OPENSTEP was turning into a "legacy platform" ... yet today MacOSX and iOS rely on Objective-C and descendants of the NeXT APIs. Even if a language fades com

  • by Junta ( 36770 ) on Sunday March 09, 2014 @12:08AM (#46437767)

    Though I'm not usually dealing with Microsoft platforms, I have enough experience with it to consider classifying it as 'legacy' in any sort of universal way an odd proposition. It is after all *the* first-party supported development framework for Microsoft platforms, very much continuing to be supported and developed by a pretty important market force (like it or not).

    Of course 'Legacy' is mostly in the eye of the beholder. About the only place 'Legacy' seems to have unambiguous meaning is within a single development organization replacing/phasing out projects they control. COBOL continues to see pretty significant deployments and is actively being enhanced, though most people in the industry would consider that 'Legacy'. Similar story for Fortran. A number of languages that don't get so much 'glory' these days continue to play important roles in particular segments and continue to be developed. There are those that would consider PHP 'legacy' and others just moving onto the platform. If you try to name a platform that by popular opinion is almost certainly totally 'Legacy' you'll probably discover not only some groups doing new development in the language, but some companies or projects actually continuing to enhance the language for others. Basically, if you can remember it, it by some definition is probably still alive.

  • I would not call it "legacy". Your coworker is just a hipster.

  • You know, some people still have to write real applications.

    There's a whole crop of big internal LOB web applications that are showing serious aging problems since they only work with older browser types. As they're being chucked at great cost, it's been getting easier and easier to convince people that desktop apps are better - you get a higher quality, more stable UI, and they don't just randomly break with browser upgrades.

    I wrote a fairly large (100KLOC+) LOB .NET application in 2002, and it's still
  • Back in the early 1990s we wrote some software in Borland C++ to run under Windows 95 on an industrial PC, to do some measurements and transfer data via an RS232 link, and installed these boxes at various transformer stations at a utility. We've managed to "update" the setup to run under Windows XP, but the utility didn't want too many changes for new installations, so it was still compiled under a late-1990s version of Borland C++. It is fortunate that we also bought the source code for the Borland RS232 d
