Ask Slashdot: How Will You Be Programming In a Decade? (cheney.net)

An anonymous reader writes: Programmer Dave Cheney raised an interesting question today: How will you be programming in a decade? If you look back to a decade ago, you can see some huge shifts in the software industry. This includes the rise of smartphones, ubiquitous cloud infrastructure, and containers. We've also seen an explosion of special-purpose libraries and environments, many with an emphasis on networking and scaling. At the same time, we still have a ton of people writing Java and C and Python. Some programmers have jumped headfirst into new tools like Light Table, while others are still quite happy with Emacs. So, programmers of Slashdot, I ask you: How do you think your work (or play) will change in the next ten years?
  • Easy. (Score:5, Funny)

    by hey! ( 33014 ) on Monday December 07, 2015 @12:04PM (#51073127) Homepage Journal

    With a gesture-based interface connected to my fishing rod.

    • Re:Easy. (Score:4, Interesting)

      by ranton ( 36917 ) on Monday December 07, 2015 @12:25PM (#51073267)

      I do see software development roles splitting between people still writing code and people using graphical (perhaps gesture-based) interfaces to design workflows, approval processes, user interfaces, etc. Not sure how fishing rods factor in.

      I think my recent work with Salesforce has given me a good glimpse of the future of software development, at least for the next decade or two. 90% of the work I would have done a decade ago is now handled by a third-party platform, and I just work on the few things that need to be custom. That has been attempted by SaaS vendors before (even before it was called that), but never as well as Salesforce has done it. There is plenty of room for improvement, but their software gives an idea of what can be accomplished. Though I hope someone else beats out Salesforce's Force.com platform with something that is more engineering-focused instead of sales/marketing-focused.

      I see the software development industry breaking up into tiers like most other industries, similar to engineering, where you have engineers and CAD operators (among other roles). I see elite software engineers making much more money than they do now, but a class of programmers making wages closer to CAD operators (although a bit more) becoming the norm for most programmers. Overall it will let the industry create more software at lower cost.

      • Re:Easy. (Score:4, Insightful)

        by PolygamousRanchKid ( 1290638 ) on Monday December 07, 2015 @01:02PM (#51073573)

        Not sure how fishing rods factor in.

        Fishing rod == retirement. Or at least so I'm guessing.

        I'll be retired in a decade, so I'll just be programming for fun. That's how.

        • by VAXcat ( 674775 )
          HA! I'll be retiring in 233 days...then I'll be writing kernel-mode code in MACRO for my collection of PDP-11s.
      • It was assembler first. Then FORTRAN and BASIC and assembler. Then assembler and C. Then C and Perl. Then C and Python.

        It's been C and Python ever since.

        The shift from assembler was forced on me because the underlying platforms began to diverge; C took care of that, while remaining low level enough not to suffer the slings and arrows of clunk, lethargy, and various types of safety nets of a hoop-jumping nature.

        Perl put a moderate amount of readily accessible speed and a great deal of power on the table. Tha

        • by mikael ( 484 )

          The thing about C++ is that you can't just learn the language itself to be relevant to employers. That knowledge has to include at least the STL, if not Boost, as well as some GUI toolkit (Qt) and multi-threading (Intel TBB). Other companies may have moved on to using Python with PyQt and taken parallel processing up to PyCUDA.

      • I don't see how this is different from the past where most programmers "coded" in RPG, or in Delphi, or in Powerbuilder, or in MS Access.

        • by ranton ( 36917 )

          I don't see how this is different from the past where most programmers "coded" in RPG, or in Delphi, or in Powerbuilder, or in MS Access.

          Not much different. Just like the iPhone wasn't much different from the Newton. But at a certain point technologies become mature enough that they can deliver on the hype that has built up for decades.

          What tools like PowerBuilder and Access lacked, IMHO, is that they weren't built upon an enterprise-grade infrastructure. Perhaps I am not being fair to PowerBuilder, since I only worked on one project with it and as a junior developer, but it didn't seem as extensible as even a VB application. All I know is ou

    • The correct question is: how would you be programmed in a decade...

    • Capture a mermaid and update your Facebook status with a single gesture of your fishing rod.
    • by rwa2 ( 4391 ) *

      With a gesture-based interface connected to my fishing rod.

      Is that a euphemism for something? Yeah, I think I've seen that interface in Sex & Zen 2

    • Now I get it, you mean that rod.

    • With a gesture-based interface connected to my fishing rod.

      Maybe not that, but I'd love it if we had something where no knowledge of programming languages was needed, and it was mostly a case of click, drag, drop... I recall that NEXTSTEP had something like it, and to an extent, so did C++Builder, JBuilder and Turbo Pascal from Borland. Something where the code could be easily generated would be fantastic!!!

      • Re:Easy. (Score:4, Insightful)

        by hey! ( 33014 ) on Monday December 07, 2015 @05:20PM (#51076139) Homepage Journal

        Well, as you probably guessed, I've been around a long time; this idea comes up over and over again, and it never takes off, for a good reason. Programming is hard; it's deeply tied to logical reasoning, which in turn is tied to language and notation. Having visual representations as an adjunct often does make reasoning easier, but having only visual representations does not.

        Through the years I've met a number of people who claim to be "visual thinkers", but in fact I don't think most people who make that claim are particularly good at visual thinking. What they really mean is they want things kept simple so they don't have to work that hard; when confronted with visual subtlety or complexity they're just as lost as when they are confronted with linguistic complexity. Basically they're mentally lazy but prefer to think of themselves as misunderstood.

        Now there are people who are great visual thinkers. Any decent graphic designer is bound to be a strong visual thinker. But oddly enough it's not graphic designers who make this claim. It's usually managers who don't have the patience to read through pages of text; but they don't have the patience to wade through pages of diagrams, either.

  • by halivar ( 535827 ) on Monday December 07, 2015 @12:06PM (#51073131)

    Learning new languages every six months is a young man's game. As I get older, I will gravitate towards jobs where I can leverage 15+ years of experience in a language to get better-paying positions.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      Going forward, it's less and less about the language and more about the systems. Everyone can learn Java and pick up the tool stack, but the domain knowledge, the systems experience, and, hell, just knowing the right contacts become very valuable in long-term industries (aerospace, defense, medical, etc.).

      As long as you pick something that doesn't get entirely replaced, it's a good way to spend the last 15 or so years of your career, and even if it does get entirely replaced, they're gonna need a lot of tha

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      It's also an idiot's distinction. There are very few 'new' languages. Most of them are just syntactic changes that integrate more or less of the C++ standard library so that you can do a particular task with less boilerplate. There is nothing wrong with that, but I get annoyed when fanbois keep going on about how 'new' languages are somehow revolutionary.

      It is like the whole functional programming thing. I spent a while working my way through blogs going on about how amazing functional programming is, whil

      • by Anonymous Coward on Monday December 07, 2015 @12:35PM (#51073355)

        Easy enough to move from language to language, but toolstack to toolstack less so. If you've used C++ you can "learn Java" very quickly, but learning the increasingly complex libraries and frameworks that tend to accompany it can take a while. Even if you've worked with similar tools, it can take a while to learn all the best practices and shortcuts and little nuances.

        It even extends beyond programming itself. Methodologies change and the toolstack used to implement those methodologies changes with them. We've generally migrated from bug trackers (bugzilla, mantis, etc) to project trackers (trac, redmine), and chances are in a few years we'll be doing something else.

        People joke about old men stuck in their ways, but as I get older I kinda get it. After a few iterations my enthusiasm to learn the next greatest thing has waned; it feels like something I have to do rather than something I want to do, and the gain starts to feel less worth it. Is Gradle really that much better than Maven? Was Maven really that much better than Ant+Ivy? Once I become a Gradle guru, something is just gonna come up and replace it as the de facto standard, so why even bother?

        The only solution is to become a manager and become the roadblock we all hated when we first started.

        • You are failing to learn from history.

          In the 1970s we learned a new language every month. Programmers were expected to read the manual - all 10 pages of it - and then start using it.

          Currently, you are expected to have five years' experience with software released 6 months ago.

          In 10 years' time, you will be expected to have 50 years' experience with a product on the day it is released.

          OR ...

          still be writing PHP5 by throwing virtual cow-pats at the virtual (server) farm on the screen with your Wiimote, as

          • LOL "all 10 pages of it"

            https://en.wikipedia.org/wiki/... [wikipedia.org]

            Sorry kiddo, your grandpa was off his meds when he told that story.

            • What ten pages? I learned C without that book. You can learn C from a one-page cheat sheet if you already know another low-to-medium-level programming language; well enough to read and write simple programs, anyway. You'll certainly need to learn more for the complex stuff (pointer arithmetic). But almost every language is mastered by learning just a little bit at first, then a little more later, then a little more, etc.

              I had a boss once who learned C in 21 days; he had the book on his shelf as proof.

      • by angel'o'sphere ( 80593 ) on Monday December 07, 2015 @02:31PM (#51074535) Journal

        until I came across a guy who had come to it from C and was like 'yeah, so basically callback functions with a loose stack implementation
        Except that functional programming is much more than that ...

        The C++ analogy would be: have a class with an overloaded operator(); its "objects" then behave as functions. You can return such functions from ... erm ... functions.

        In real functional languages you can compose new functions on the fly and either use them as parameters or return values, or apply them to arguments.

        And, for the C crowd: a function is not an address of a piece of code you call; it is more like a piece of "data" you allocate with malloc and interpret later. Or in C++: it is a so-called "first-class citizen", like a class or a struct.
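
        To make the "first-class" idea concrete, here is a minimal sketch in Python (just an illustration; the compose helper is made up for this example) of functions being built on the fly, passed as parameters, and returned as results:

            # Functions as first-class values: built at runtime, passed
            # around, and returned like any other piece of data.

            def compose(f, g):
                # Build a new function on the fly: compose(f, g)(x) == f(g(x))
                return lambda x: f(g(x))

            def increment(x):
                return x + 1

            def double(x):
                return x * 2

            # A function produced by another function:
            inc_then_double = compose(double, increment)
            print(inc_then_double(3))                      # 8

            # A function passed as a parameter, like any other argument:
            print(list(map(inc_then_double, [1, 2, 3])))   # [4, 6, 8]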

    • Learned Perl over Columbus Day weekend in 1992 as an E-4 in the Air Force; still using it today, for contracted and open-source projects.

    • by Rob Y. ( 110975 ) on Monday December 07, 2015 @12:43PM (#51073413)

      It's not the language at all. It's the way to structure an application that's changed drastically. I used to write server-based apps, with a smart terminal front end. It made for a nice, simple, supportable structure, with a reasonable GUI. Recently, I've delved into web programming. Javascript is fine as languages go - though the various libraries built around it are probably more difficult to get a handle on. But the main surprise is what has come to constitute an application. To the extent that there's an application, per se, it consists of Javascript code in the browser, with data accessed via services on the back end. And mostly on a single page basis. In other words, the surprise is that there's no module that counts as an overarching 'application' that defines a structure encompassing a large set of functionality. I have no idea how this structure would scale up beyond a small set of web pages. Not scale in terms of being able to support a large number of users, but in terms of anybody knowing (or remembering) how all the bits of code fit together.

      • As I explain in my previous comment, [slashdot.org] the most recent changes in structure are being driven by theoretical advances in Category Theory (the "abstract nonsense" that brought us monads and LINQ).

        This means that they are particularly well suited to large systems, as Category Theory is the science of composing small parts to build a large structure without scaling problems. In theory, programs using these techniques should be easier to understand and maintain, at least once you get a preliminary grasp of the underpinnings.

    • by myrdos2 ( 989497 )

      Agreed. Ten years ago I was programming mostly in C++ and C#. Ten years from now I'll be programming mostly in C++ and C#. While our shop has lots of different user interface platforms for the same product, ranging from PowerScript to XAML to HTML5, I don't see the core code changing in ten years. It just gets to wear different clothes, according to the style at the time.

    • by hey! ( 33014 )

      I dunno. Learning a new language is easy. It's the APIs they come with that's a bitch.

  • by Anonymous Coward
    It will all be done either in India or via automation.
  • by crow_t_robot ( 528562 ) on Monday December 07, 2015 @12:09PM (#51073151)
    I'll be programming the same except using Cherry MX periwinkle switches instead of the current blue ones I have now.
    • by Tablizer ( 95088 ) on Monday December 07, 2015 @12:51PM (#51073473) Journal

      You are behind, dude, Cherry MX Periwinkle Switches Reloaded++ is now out.

      Seriously, who the hell knows what's 10 years down the road. The industry is driven as much by fads as logic, if not more.

      I just hope the UI side simplifies so that one doesn't have to, say, diddle with the minutiae of scroll-bar coordinates for everyday GUI idioms and bread-and-butter CRUD. I'd like to focus on domain logic rather than micromanage UI glitches all day.

      UIs are a f8cking mess unless you target a specific browser brand and version. We devolved from the desktop days. I pray the industry cleans up the UI mess created by the browser. Unfortunately the industry seems to be chasing eye-candy fads instead of practical things, but I guess the money is in hype and flash.

      In summary, get off my UI lawn!

      • by RabidReindeer ( 2625839 ) on Monday December 07, 2015 @02:39PM (#51074591)

        We devolved from the desktop days.

        Oh yes. One of the worst things browsers did was virtually destroy the ability to use shortcut keys to do useful work, instead of having to grab the mouse and irritate your carpal tunnels. All the shortcut keys now either do nothing or control the browser, not the app in the browser.

        Plus far too many webpage authors don't leverage what few amenities we could have. For example, how many form-based pages have you visited where there's a preselected input so you can start typing instantly, instead of having to grab-mouse-and-click first?

        And don't even get me started on the drag-resized panes where the "drag grab" area is so small that you have to have machine-like motor skills to be able to mouse over it, click down, and drag without losing the whole operation.

        But when it comes to gratuitous and annoying auto-playing audio-visuals, we're great!

        • But the alternative sucks even more. If a web page wants to allow keyboard shortcuts, then it has to not conflict with the browser's shortcuts. I was annoyed far too many days back in the Flash era when I hit Ctrl-T or Ctrl-W but nope, the Flash plugin grabbed it. One thing Windows does well hotkey-wise is that it puts all the OS shortcuts on Win+____ and nothing conflicts with them. Browsers, on the other hand, use both Ctrl+___ and Alt+___ for their hotkeys, so anything that wants to use those within the
      • We devolved from the desktop days.

        Speak for yourself, I'm still in the year of the linux desktop, and I'm still using a 90s-style desktop paradigm.

        And in the browser if you use scriptblock, then sites without a traditional interface won't even look usable; you'll be spared entirely. The worst crap just obviously didn't load right, and you look for a site with legit content.

        You don't have to devolve, just increase your lawn security.

    • And here I thought it would be Chartreuse. Thinking about it, the color would be too close to Razer/Kailh green. Maybe Cherry transparent greens.

      With fully programmable million+ color backlighting.

      • And here I thought it would be Chartreuse.

        I've already been using chartreuse in most of my work, ever since I saw St. Wall using it in the '90s for his "home page." He's still doing it, and so am I. http://wall.org/~larry/ [wall.org]

        Client: Why is the site so ugly?
        Me: So that you'll hire a web designer to fix the CSS when I'm done with the backend :)

  • Windows batch files, written in Notepad.
  • by bigpat ( 158134 ) on Monday December 07, 2015 @12:11PM (#51073171)

    In ten years I intend to be programming in management speak, functional specifications and almost completely useless and barely intelligible pseudo code.

    • by rwa2 ( 4391 ) *

      Yep, this. And those programmers will only be writing a handful of malformed TDD test cases and toss them over the fence to a foreign shop to search Stack Exchange for random bits of code that makes those test cases pass without much understanding of what the original problem was.

      • >> TDD test cases and toss them over the fence ...makes those test cases pass without much understanding of what the original problem was

        As designed. That's how TDD breaks up work...

  • And likely training my lower-cost Vietnamese replacement.
  • by LynnwoodRooster ( 966895 ) on Monday December 07, 2015 @12:30PM (#51073307) Journal
    Poorly.
  • by DaPhil ( 811162 ) on Monday December 07, 2015 @12:32PM (#51073333)

    It seems to me that you need the languages with the right features to be able to implement good tool support. Consider the excellent IDEs that have been created for Java (Eclipse, IDEA, NetBeans) with extremely advanced refactoring capabilities, code navigation, and inline compilation with meaningful error messages. Such support requires the ability to do static analysis, which you can't do properly in some of the newly popular languages like JavaScript.
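
    To make concrete what that static analysis buys, here is a sketch in Python with type annotations, assuming an external checker such as mypy (my choice for the illustration, not something those IDEs use): a tool can flag the bad call below without ever running the program, and the same information is what powers safe refactoring, navigation, and meaningful error messages.

        # Type annotations give a static checker (e.g. mypy; an assumption
        # for this sketch) enough information to reject the bad call below
        # before anything executes.

        def area(width: float, height: float) -> float:
            return width * height

        print(area(3.0, 4.0))      # fine: both arguments are floats

        # A static checker flags the next line at analysis time; it is left
        # commented out so the sketch also runs cleanly as a script.
        # area(3.0, "four")        # error: str is not compatible with float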

  • Source files will be in a machine-readable format that conveys meaning, while individual programmers will have a choice of textual, graphical and hybrid representations to work on that meaning. So most people will just drag and drop an image into a source editor and start using it without worrying about how it is stored in the application bundle. But if another programmer on your project uses vi and wants to explicitly refer to R.drawable.pacman, they can.

    • Source files will be in a machine-readable format that conveys meaning, while individual programmers will have a choice of textual, graphical and hybrid representations

      I think I've seen that already, it was called "machine language".

      • by iamacat ( 583406 )

        Machine language does not capture the full meaning intended in the source, although JVM bytecode comes close.

  • by bluefoxlucid ( 723572 ) on Monday December 07, 2015 @12:37PM (#51073383) Homepage Journal

    The article is like, "Hey! Look! Android! Containers! New execution environments! IDEs!"

    Meanwhile I learned to code in QuickBASIC 4.5 in a procedural model. I then started doing functional programming in C, and that whole "modular" thing where we break our programs into chunks. Object-oriented programming was in relative infancy, and I learned it when it was just wrapping up related stuff into objects.

    We now have more complex design patterns. The Gang of Four book and Code Complete are a mess to read; Tony Bevis did a better job writing a clear, concise explanation in C# [amazon.com] and Java [amazon.com].

    It's not the tools and the languages; it's the method of problem solving. Project Management today is not the same as Project Management in 1980 (I'm CAPM certified). Engineering isn't the same. We've created new construction techniques, not just new materials and tools. Programming hasn't just advanced in terms of languages and system platforms; we've created new methods for writing enormous programs without doing a shitton of refactoring.

    I haven't assimilated the new methodologies yet. I can't plan on a grand scale using those tools; my brain knows how to use the old ones and can project at low resolution, then fill in all the gaps at high resolution. I need to burn these new abstract factories and decorators and other bullshit into my contextual thinking before I can just throw down immensely complex, well-architected computer programs. I know the whole deal with being from the old school, and I know how hard it is to change; I also know what worked for the last set of problems doesn't fit this new set. That's sort of foundational knowledge for me [wordpress.com]: the correct approach depends on the problem, not on what your favorite tools are.

  • There will be task-appropriate languages/tools, such as C for lower-level embedded/drivers/OSes, and evolving, more abstract languages such as Java/Python/etc., and folks will continue to wrongly apply high-level tools to low-level tasks. So no, I see no driver for change in the future. It would be cool to see an on-chip hardware data stack added to CPUs (CPU side of the pipe) and supported in C with Forth-like statements. It will never happen, but one can still dream of the performance improvements.
  • by Natales ( 182136 ) on Monday December 07, 2015 @12:49PM (#51073457)


    CI/CD systems will automate the heck out of everything, and there will be less and less visibility into what's running where and how.

    "Cloud Native" applications designed around microservices with well-defined interfaces and running in some PaaS "somewhere" will become the norm. I sadly foresee that developers themselves will be expected to become microservices, basically expected to do one thing only, and one thing well, and forbidden to look beyond their immediate horizon of the ever rolling Agile backlog. There will be less space for creativity at the individual level, and massive invisible machine learning software running in the back-end of the datacenters will automatically generate "facts" for the suits in charge, and possibly even stories on a backlog based on those facts. In 20 years, they'll generate their own code.
  • If you get too old, the insurance costs to the employer go up. But you are not allowed to discount that in the market, because that would be "discrimination". But they can refuse to hire you in the first place and make up some non-discriminatory reason. So the reason old folks can't get a job doing a skill they know how to do is because government decided to "help" them.

    Instead, we must con women to do what they (quite rationally) don't want to do, so we can get our statistics right.

    Remember this the next t

    • Single-payer healthcare, or the end of the employer-based system, will likely arrive in the US within 10 years.

    • by PPH ( 736903 )

      ... as a direct employee. So become a contractor and take care of your own insurance. If you are in good health, you can buy a plan for a reasonable premium. When I left Boeing over a decade ago, the cost to pick up their insurance as a COBRA [wikipedia.org] plan was significantly higher than private insurance. And my premiums today are still lower than their plan was back then. Due, I imagine, to the age distribution and relatively poor health of their average employee.

      Instead, we must con women to do what they (quite rationally) don't want to do, so we can get our statistics right.

      Minor logical error there. Sit in on an HR strategy

  • by shess ( 31691 ) on Monday December 07, 2015 @12:55PM (#51073503) Homepage

    Ten years ago, I was coding gnarly C++. Today it's even more gnarly because the projects are bigger and the problems more subtle. I think my only way out of this trap will be to make a conscious decision to stop, but even if I opt out, others will be in there doing the same basic stuff to make everything keep running.

    The Objective-C knowledge I began developing in 1988 will probably be less useful in ten years, though. If you had asked me in 1995 if I would be intentionally avoiding Objective-C work in 2015 because of burnout, I would have laughed at you.

    I hope that my Perl knowledge will be useless in ten years, but I fear that it will be the most lucrative system I know.

    In the 80's, software engineering was an optimistic industry: structured programming had helped so much, object-oriented programming seemed likely to make things easy, logic programming was going to automate a lot of stuff, and we were going to move upstream to direct solvers and provers. Sometime in the 00's, everyone gave up and decided that optimism was overrated and software engineering would never earn the "engineering" part, so instead let's just try to mitigate the vicious cycles to keep them from going too far foul. I think in ten years, things are going to look basically the same as today, with minor evolutionary additions, and we might even argue about whether things have changed enough to be worth talking about.

  • by Greg Merchan ( 64308 ) on Monday December 07, 2015 @12:56PM (#51073515)

    If you'd told me 10 years ago that I'd mostly be programming in LabVIEW today, I would have laughed. It's "not a real language". It's proprietary. Manipulating graphics takes so much longer than typing. Etc.

    I still don't like that it's proprietary.

  • by DFDumont ( 19326 ) on Monday December 07, 2015 @12:56PM (#51073517)

    I suspect, given the trends of the past decade, that there will be more pseudo-code-looking scripts written in the language du jour than actual code: API calls to API calls that invoke still other APIs, without any understanding of what is actually being executed or on what platform it is executing. There has been a lot of effort put into making 'coding' simpler and more distributed, which has many faults. First and foremost, the simpler it is to code, the dumber our coders become. Similarly, the more distributed we get, the harder it is to diagnose problems.
    It used to be that a good debugger was all you needed. Now you can barely even tell what is going on without a sniffer trace, and even that will leave you wanting for some piece of the puzzle. I'm not suggesting a return to the days of COBOL, but not all advances result in better code.

  • Whatever the language or application, the one constant for me has always been the editor. I'm sure that will remain the same.

  • Unfortunate name there...
  • by engineerErrant ( 759650 ) on Monday December 07, 2015 @01:32PM (#51073943)

    My Scrum Lord says that I'll drive peak stakeholder value for a billion years if I but open my heart to the One True Methodology.

    • Every Scrum practitioner worth his salt will tell you: there is no one true methodology.

      There is plenty of work that cannot be done in sprints, e.g.:

      Imagine an emergency treatment center ... priorities shift as injured victims get delivered. And when for three weeks not a single patient shows up, the only work you have "done" is your paperwork ... but who cares.

      I hate those Scrum haters who have no clue about Scrum (or XP, or Kanban, or Crystal Methods, or other agile methods) because they are too dumb to gras

  • a. All the hype of new languages, frameworks, and platforms will be debunked as fast as the blitz/viral marketing efforts that we see today.
    b. The "mystery" of coding will be old hat from a nomenclature standpoint. Everyone will recognize (not necessarily understand) FIFO queues, certs, etc., and even understand what a buffer overflow means. It will not affect careers & salaries in s/w, but it will call out overrated tasks.
    c. As much as Silicon Valley wants coders to be rock stars (e.g. as in Silicon Valley)

  • Visual Haskell (Score:5, Interesting)

    by TuringTest ( 533084 ) on Monday December 07, 2015 @01:37PM (#51074007) Journal

    There's a wealth of new research going on in Programming Language Theory, with several breakthroughs in recent years bridging the gap between functional and imperative programming.

    The other trend in declarative programming is reactive frameworks like React.js and Flux being applied to user interfaces. This allows for tools like React Native, which can abstract away all the spaghetti code to handle events, providing a higher abstraction, including the "debug & rewind" and "live programming" capabilities seen in online "web embedded" environments like GitHub Gist or JSFiddle.

    I expect that, as these techniques mature, they will settle down and enable development techniques that allow for easy discoverability of APIs without having to learn a particular complex syntax, and better programming by connecting components without the drawbacks and limitations of classic visual tools.

    All these new techniques based on Category Theory are driving advances in mainstream languages - starting with libraries like LINQ and jQuery, but also Python, JavaScript and even C++ adopting lambdas, advanced type systems with auto-inference of types, and libraries with constructs for declarative race-free parallelism such as promises and agent models.

    The majority of those techniques are being tested first in experimental languages by researchers eating their own dog food, with Haskell often hosting their purest form (see what I did there?). Anyone interested in enhancing the expressivity of PLs may lurk on Lambda the Ultimate [lambda-the-ultimate.org], where guys much more clever than you and me hang around and can give pointers to all the relevant theoretical results.
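
    As a small illustration of that promise style in a mainstream language (Python's concurrent.futures here, my choice of library for the sketch): the work is described as tasks, each submit() returns a future (a promise of a result), and the runtime coordinates the scheduling with no explicit locks.

        # Promise-style parallelism with futures; the executor and worker
        # function below are made up for this illustration.

        from concurrent.futures import ThreadPoolExecutor

        def slow_square(x):
            return x * x   # stand-in for real work (I/O, computation, ...)

        with ThreadPoolExecutor(max_workers=4) as pool:
            futures = [pool.submit(slow_square, n) for n in range(8)]
            results = [f.result() for f in futures]   # block until resolved

        print(results)   # [0, 1, 4, 9, 16, 25, 36, 49]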

  • by rlp ( 11898 ) on Monday December 07, 2015 @01:43PM (#51074053)

    Use my neural interface to write a program to a data crystal that can display on the holodeck. Then leave work in my flying car.

  • As I continue to work in array-oriented languages like APL or J, as I have for years, it's interesting how very slowly the new languages are re-discovering things we've known for decades. As someone I know said, "Google invented map-reduce in 2004 and Ken Iverson cleverly re-invented it in 1964".

    Eventually, the idea that novice errors are irrelevant to language design may slowly work its way into the mainstream, along with any number of other unrecognized language desiderata that will seem obvious in retro
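
    For anyone who hasn't seen the array-language style: the map-reduce pattern mentioned above is a single expression there (roughly +/ *: nums in J, the sum of squares), and the Python sketch below (my own illustration) spells out the same two steps explicitly.

        # map applies a function across the data; reduce folds the results
        # together. In an array language this whole program is one line.

        from functools import reduce

        nums = [1, 2, 3, 4]

        squares = list(map(lambda x: x * x, nums))       # the "map" step
        total = reduce(lambda acc, x: acc + x, squares)  # the "reduce" step

        print(total)   # 30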

  • I have been writing "enterprise software" (boring, lucrative software used by big business) for 15 years. Little has changed but the vocabulary, and even then it is synonym-based change ("Hosted" vs "Cloud", "GUI" vs "Single page application", etc.). Sure, I use new frameworks and describe my work in "new paradigms", but it is all just the same.

    No matter what you do, or how you describe it, what tools you use, or even how you plan it, at a certain point you just have to do the thing.. actually write the
  • But honestly, I have no clue. I'm still trying to catch up on all the programming paradigms that were the new hotness 3 or 4 years ago.
  • Probably the same way I do now.

    Spend the first quarter of the day in meetings, either daydreaming or staring off into space. Spend the second quarter of the day responding to email, talking to customers on the phone, or talking to coworkers about interfacing our code. Spend the third quarter of the day doing some vim. Spend the fourth quarter of the day browsing the internet and waiting to go home.

    Really, the actual typing away part of my job is pretty small. I probably spend at least as much time whiteboa
  • Language-wise:

    1. Hopefully the death of Java and similar GC knock-off languages like C#. The world would be a really nice place without them.
    2. Expect the demise/marginalisation of dynamic languages like Ruby, Python, etc. (Python may survive for a bit, as some of the NLP libraries are written in it).
    3. Expect JavaScript to be the de facto language. In fact, it already has become that on the web, but it has yet to get closer to the OS and general hardware.
    4. C/C++ will remain there as long as there are hardware and peripherals. Fo

  • I sincerely hope it is not *still* with a mouse and keyboard. For me that is currently the least sucky way to interact with my PC, but I wish something existed that would let me be more productive; I spend too much time moving the mouse around and typing.
    • You better hope we still have mice and keyboards, 'cause if we don't we'll probably be forced into using some kind of horrendous touch interface. (shudder) Or even worse: voice! Can you imagine the typical steno-pool sea of developer cubicles with everyone jabbering code at their computer? I doubt neural I/O will ever be practical in my lifetime, or at least not as efficient as keyboard/mouse.

  • by musicmaker ( 30469 ) on Monday December 07, 2015 @04:33PM (#51075737) Homepage

    After nearly 20 years doing this, I believe that what software developers actually do all day long is primarily ontology, and secondarily engineering. Conceptually fitting the real world into a little box filled with transistors is hard. You can't automate thinking (at least not yet). The computer world is a limited representation of the real world, and that translation, deciding which things to take and which things to leave, and what shape they take in the virtual, is something that a computer cannot do, and surely won't be able to do in 10 years' time, possibly not even in 100 years' time. Until then, programmers will be doing the same thing they do today, just in a slightly faster format with slightly updated tools and slightly slicker interfaces: crafting a virtual and limited representation of the real that allows modeling and the generation of knowledge and value from information and data.
