Ask Slashdot: How Will You Be Programming In a Decade? (cheney.net) 279
An anonymous reader writes: Programmer Dave Cheney raised an interesting question today: How will you be programming in a decade? If you look back to a decade ago, you can see some huge shifts in the software industry. This includes the rise of smartphones, ubiquitous cloud infrastructure, and containers. We've also seen an explosion of special-purpose libraries and environments, many with an emphasis on networking and scaling. At the same time, we still have a ton of people writing Java and C and Python. Some programmers have jumped headfirst into new tools like Light Table, while others are still quite happy with Emacs. So, programmers of Slashdot, I ask you: How do you think your work (or play) will change in the next ten years?
Easy. (Score:5, Funny)
With a gesture-based interface connected to my fishing rod.
Re:Easy. (Score:4, Interesting)
I do see software development roles split between people still writing code and people using graphical (perhaps gesture based) interfaces designing workflows, approval processes, user interfaces, etc. Not sure how fishing rods factor in.
I think my recent work with Salesforce has given a good glimpse of the future of software development, at least for the next decade or two. 90% of the work I would have done a decade ago is now handled by a third-party platform, and I just work on the few things that need to be custom. That has been attempted by SAAS vendors before (even before it was called that), but never as well as Salesforce has done it. There is plenty of room for improvement, but their software gives an idea of what can be accomplished. Though I hope someone else beats out Salesforce's Force.com platform with something that is more engineering focused instead of sales/marketing focused.
I see the software development industry breaking up into tiers like most other industries. Similar to engineering, where you have engineers and you have CAD operators (among other roles). I see elite software engineers making much more money than they do now, but a class of programmers making wages closer to CAD operators (although a bit more) becoming the norm for most programmers. Overall it will let the industry create more software at lower cost.
Re:Easy. (Score:4, Insightful)
Not sure how fishing rods factor in.
Fishing rod == retirement. Or at least so I'm guessing.
I'll be retired in a decade, so I'll just be programming for fun. That's how.
Re: (Score:2)
Re: (Score:3)
Oh man, it's like everyone's got a realistic plan for retirement except me.
Looking forward to anything really great (Score:3)
It was assembler first. Then FORTRAN and BASIC and assembler. Then assembler and C. Then C and Perl. Then C and Python.
It's been C and Python ever since.
The shift from assembler was forced on me because the underlying platforms began to diverge; C took care of that, while remaining low level enough not to suffer the slings and arrows of clunk, lethargy, and various types of safety nets of a hoop-jumping nature.
Perl put a moderate amount of readily accessible speed and a great deal of power on the table. Tha
Re: (Score:2)
The thing about C++ is that you can't just learn the C++ language itself to be relevant to employers. That knowledge has to include at least the STL if not Boost, as well as some GUI (Qt) and multi-threading (Intel TBB). Other companies may have moved on to using Python with PyQt and taken parallel processing up to PyCUDA.
Re: (Score:3)
"Is that a zit on your cheek?"
"No, I'm growing a database."
Re: (Score:2)
I don't see how this is different from the past where most programmers "coded" in RPG, or in Delphi, or in Powerbuilder, or in MS Access.
Re: (Score:2)
I don't see how this is different from the past where most programmers "coded" in RPG, or in Delphi, or in Powerbuilder, or in MS Access.
Not much different. Just like the iPhone wasn't much different than the Newton. But at a certain point technologies become mature enough they can deliver on the hype that has built up for decades.
What tools like Powerbuilder and Access were not good at, IMHO, is they weren't built upon an enterprise grade infrastructure. Perhaps I am not being fair to Powerbuilder since I only worked on one project with it and as a junior developer, but it didn't seem as extensible as even a VB application. All I know is ou
Wrong question. (Score:2)
The correct question is: how will you be programmed in a decade...
Re: (Score:2)
Re: (Score:2)
With a gesture-based interface connected to my fishing rod.
Is that a euphemism for something? Yeah, I think I've seen that interface in Sex & Zen 2
Re: (Score:2)
Now I get it, you mean that rod.
Re: (Score:2)
With a gesture-based interface connected to my fishing rod.
Maybe not that, but I'd love it if we had something where no knowledge of programming languages was needed, and it was mostly a case of click, drag, drop... I recall that NEXTSTEP had something like it, and to an extent, so did C++Builder, JBuilder and Turbo Pascal from Borland. Something where code could be easily generated would be fantastic!!!
Re:Easy. (Score:4, Insightful)
Well, as you probably guessed I've been around a long time, and this idea comes up over and over again, and it never takes off, and for a good reason. Programming is hard; it's deeply tied to logical reasoning, which in turn is tied to language and notation. Having visual representations as an adjunct often does make reasoning easier, but having only visual representations does not.
Through the years I've met a number of people who claim to be "visual thinkers", but in fact I don't think most people who make that claim are particularly good at visual thinking. What they really mean is they want things kept simple so they don't have to work that hard; when confronted with visual subtlety or complexity they're just as lost as when they are confronted with linguistic complexity. Basically they're mentally lazy but prefer to think of themselves as misunderstood.
Now there are people who are great visual thinkers. Any decent graphic designer is bound to be a strong visual thinker. But oddly enough it's not graphic designers who make this claim. It's usually managers who don't have the patience to read through pages of text; but they don't have the patience to wade through pages of diagrams, either.
I plan on ossifying (Score:5, Insightful)
Learning new languages every six months is a young man's game. As I get older, I will gravitate towards jobs where I can leverage 15+ years of experience in a language to get better-paying positions.
Re: (Score:2, Interesting)
Going forward, it's less and less about the language and more about the systems. Everyone can learn Java and pick up the tool stack, but the domain knowledge, the systems experience, and hell, just knowing the right contacts become very valuable in long-term industries (aerospace, defense, medical, etc.).
As long as you pick something that doesn't get entirely replaced, it's a good way to spend the last 15 or so years of your career, and even if it does get entirely replaced, they're gonna need a lot of tha
Re: (Score:2, Interesting)
It's also an idiot's distinction. There are very few 'new' languages. Most of them are just syntactic changes that integrate more or less of the C++ standard library so that you can do a particular task with less boilerplate. There is nothing wrong with that, but I get annoyed when fanbois keep going on about how 'new' languages are somehow revolutionary.
It is like the whole functional programming thing. I spent a while working my way through blogs going on about how amazing functional programming is, whil
Re:I plan on ossifying (Score:5, Insightful)
Easy enough to move from language to language, but toolstack to toolstack less so. If you've used C++ you can "learn Java" very quickly, but learning the increasingly complex libraries and frameworks that tend to accompany it can take a while. Even if you've worked with similar tools, it can take a while to learn all the best practices and shortcuts and little nuances.
It even extends beyond programming itself. Methodologies change and the toolstack used to implement those methodologies changes with them. We've generally migrated from bug trackers (bugzilla, mantis, etc) to project trackers (trac, redmine), and chances are in a few years we'll be doing something else.
People joke about old men stuck in their ways, but as I get older I kinda get it. After a few iterations my enthusiasm to learn the next greatest thing has waned, and it feels like something I have to do rather than something I want to do, and the gain starts to feel less worth it. Is Gradle really that much better than Maven? Was Maven really that much better than Ant+Ivy? Once I become a Gradle guru, something is just gonna come up and replace it as the de facto standard, so why even bother?
The only solution is to become a manager and become the roadblock we all hated when we first started.
Re: (Score:2)
In the 1970's we learned a new language every month. Programmers were expected to read the manual - all 10 pages of it - and then start using it.
Currently, you are expected to have five years experience of software released 6 months ago.
In 10 years time, you will be expected to have 50 years experience of a product on the day it is released.
OR ...
still be writing PHP5 by throwing virtual cow-pats at the virtual (server) farm on the screen with your Wiimote, as
Re: (Score:2)
LOL "all 10 pages of it"
https://en.wikipedia.org/wiki/... [wikipedia.org]
Sorry kiddo, your grandpa was off his meds when he told that story.
Re: (Score:3)
What ten pages? I learned C without that book. You can learn C from a one-page cheat sheet if you already know another low-to-medium-level programming language; well enough to read and write simple programs, that is. You'll certainly need to learn more for more complex stuff (pointer arithmetic). But almost every language is mastered by learning just a little bit at first, then a little more later, then a little more, etc.
I had a boss once who learned C in 21 days, he had the book on his shelf as proof.
Re:I plan on ossifying (Score:4, Informative)
until I came across a guy who had come to it from C and was like 'yeah, so basically callback functions with a loose stack implementation ...
Except that functional programming is much more than that
The C++ analogy would be: have a class with overloaded operator(), its "objects" then behave as functions. You can return such functions from ... erm ... functions.
In real functional languages you can compose new functions on the fly and either use them as parameters, or result types or apply them to arguments.
And, for the C crowd: a function is not an address of a piece of code you call; it is more a piece of "data" you allocate with malloc and interpret later. Or in C++: it is a so-called "first-class citizen", like a class or a struct.
Re: (Score:3)
Exactly. You're forgetting one small but very important thing:
* Assembly Language (and/or the micro-code of the CPU)
At the end of the day you still have a sequential IP (Instruction Pointer), you still have registers, you still have tests, you still have memory, and you are still doing some transform on data.
Lisp machines died out years ago -- that means the fundamental _underlying_ hardware IS basically C's model.
> Paul Graham once commented that the trend in language design was to take C and add Lisp feat
Re: (Score:2)
Even Emacs is written in C. The idea that there is a fundamental difference is incorrect. The differences are subjective and related to the programmer experience and the set of metaphors that are used.
Re: (Score:2)
Learned Perl over Columbus Day weekend in 1992 as an E-4 in the Air Force; still using it today, for contracted and open-source projects.
Re: I plan on ossifying (Score:3)
I'm sorry to hear that.
Re:I plan on ossifying (Score:4, Insightful)
It's not the language at all. It's the way to structure an application that's changed drastically. I used to write server-based apps, with a smart terminal front end. It made for a nice, simple, supportable structure, with a reasonable GUI. Recently, I've delved into web programming. Javascript is fine as languages go - though the various libraries built around it are probably more difficult to get a handle on. But the main surprise is what has come to constitute an application. To the extent that there's an application, per se, it consists of Javascript code in the browser, with data accessed via services on the back end. And mostly on a single page basis. In other words, the surprise is that there's no module that counts as an overarching 'application' that defines a structure encompassing a large set of functionality. I have no idea how this structure would scale up beyond a small set of web pages. Not scale in terms of being able to support a large number of users, but in terms of anybody knowing (or remembering) how all the bits of code fit together.
Re: (Score:2)
As I explain in my previous comment, [slashdot.org] the most recent changes in structure are being driven by theoretical advances in Category Theory (the "abstract nonsense" that brought us monads and LINQ).
This means that they are particularly well adapted, as Category Theory is the science of composing small parts to build a large structure without scaling problems. In theory, programs using these techniques should be easier to understand and maintain, at least once you get a preliminary grasp of the underpinnings
Re: (Score:2)
Agreed. Ten years ago I was programming mostly in C++ and C#. Ten years from now I'll be programming mostly in C++ and C#. While our shop has lots of different user interface platforms for the same product, ranging from PowerScript to XAML to HTML5, I don't see the core code changing in ten years. It just gets to wear different clothes, according to the style at the time.
Re: (Score:2)
I dunno. Learning a new language is easy. It's the APIs they come with that's a bitch.
We won't be. (Score:2)
10 Years (Score:3)
Re:10 Years [damned UI's] (Score:5, Insightful)
You are behind, dude, Cherry MX Periwinkle Switches Reloaded++ is now out.
Seriously, who the hell knows what's 10 years down the road. The industry is driven as much by fads as logic, if not more.
I just hope the UI side simplifies so that one doesn't have to, say, diddle with the minutiae of scroll-bar coordinates for everyday GUI idioms and bread-and-butter CRUD. I'd like to focus on domain logic rather than micromanage UI glitches all day.
UIs are a f8cking mess unless you target a specific browser brand and version. We devolved from the desktop days. I pray the industry cleans up the UI mess created by the browser. Unfortunately the industry seems to be chasing eye-candy fads instead of practical things, but I guess the money is in hype and flash.
In summary, get off my UI lawn!
Re:10 Years [damned UI's] (Score:4, Interesting)
We devolved from the desktop days.
Oh yes. One of the worst things that browsers did was virtually destroy the ability to use shortcut keys to do useful work instead of having to grab mouse and irritate carpal tunnels. All the shortcut keys now either do nothing or control the browser, not the app in the browser.
Plus far too many webpage authors don't leverage what few amenities we could have. For example, how many form-based pages have you visited where there's a preselected input where you can start typing instantly instead of grab-mouse-and-click before you start typing?
And don't even get me started on the drag-resized panes where the "drag grab" area is so small that you have to have machine-like motor skills to be able to mouse over it, click down, and drag without losing the whole operation.
But when it comes to gratuitous and annoying auto-playing audio-visuals, we're great!
Re: (Score:2)
Re: (Score:2)
We devolved from the desktop days.
Speak for yourself, I'm still in the year of the linux desktop, and I'm still using a 90s-style desktop paradigm.
And in the browser if you use scriptblock, then sites without a traditional interface won't even look usable; you'll be spared entirely. The worst crap just obviously didn't load right, and you look for a site with legit content.
You don't have to devolve, just increase your lawn security.
Re: (Score:2)
And here I thought it would be Chartreuse. Thinking about it, the color would be too close to Razer/Kailh green. Maybe Cherry transparent greens.
With fully programmable million+ color backlighting.
Re: (Score:2)
And here I thought it would be Chartreuse.
I've already been using chartreuse in most of my work, ever since I saw St. Wall using it in the `90s for his "home page." He's still doing it, and so am I. http://wall.org/~larry/ [wall.org]
Client: Why is the site so ugly? :)
Me: So that you'll hire a web designer to fix the CSS when I'm done with the backend
The old-fashioned way! (Score:2)
Re: (Score:3)
BATCH files are DOS you n00b, and they're written in EDIT for DOS!
Re: (Score:2)
EDLIN son, EDLIN. Now get off my lawn.
[John]
Re: (Score:2)
Using DEBUG.COM, entering the individual characters in hex, then setting cx to the number of bytes and writing it out to disk.
No debug? Copy CON to the file you want, then control-z to end it.
Re:The old-fashioned way! (Score:5, Funny)
I directly load programs into memory through the tape-in port by modulating my flatulence into a microphone.
programming by telling programmers what to program (Score:5, Funny)
In ten years I intend to be programming in management speak, functional specifications and almost completely useless and barely intelligible pseudo code.
Re: (Score:2)
Yep, this. And those programmers will only be writing a handful of malformed TDD test cases and tossing them over the fence to a foreign shop to search Stack Exchange for random bits of code that make those test cases pass without much understanding of what the original problem was.
Re: (Score:2)
>> TDD test cases and toss them over the fence ...makes those test cases pass without much understanding of what the original problem was
As designed. That's how TDD breaks up work...
Wearing a Turban and speaking Hindi (Score:2)
Judging by past performance? (Score:5, Funny)
Good tool support needs "good" languages (Score:4, Insightful)
It seems to me that you need the languages with the right features to be able to implement good tool support. Consider the excellent IDEs that have been created for Java (Eclipse, IDEA, NetBeans) with extremely advanced refactoring capabilities, code navigation, and inline compilation with meaningful error messages. Such support requires the ability to do static analysis, which you can't do properly in some of the newly popular languages like JavaScript.
Move away from static text (Score:2)
Source files will be in a machine-readable format that conveys meaning, while individual programmers will have a choice of textual, graphical and hybrid representations to work on that meaning. So most people will just drag and drop an image into a source editor and start using it without worrying about how it is stored in the application bundle. But if another programmer on your project uses vi and wants to explicitly refer to R.drawable.pacman, they can.
Re: (Score:2)
Source files will be in machine readable format that conveys meaning, while individual programmers will have a choice of textual, graphical and hybrid representations
I think I've seen that already, it was called "machine language".
Re: (Score:2)
Machine language does not capture full meaning intended in the source, although JVM comes close.
Article and comments missing the point (Score:3, Insightful)
The article is like, "Hey! Look! Android! Containers! New execution environments! IDEs!"
Meanwhile I learned to code in Quick Basic 4.5 in a procedural model. I then started doing functional programming in C, and that whole "modular" thing where we break out programs into chunks. Object oriented programming was in relative infancy, and I learned that when it was just wrapping up related stuff into objects.
We now have more complex design patterns. The Gang of Four book and Code Complete are a mess to read; Tony Bevis did a better job writing a clear, concise explanation in C# [amazon.com] and Java [amazon.com].
It's not the tools and the languages; it's the method of problem solving. Project Management today is not the same as Project Management in 1980 (I'm CAPM certified). Engineering isn't the same. We've created new construction techniques, not just new materials and tools. Programming hasn't just advanced in terms of languages and system platforms; we've created new methods for writing enormous programs without doing a shitton of refactoring.
I haven't assimilated the new methodologies yet. I can't plan on a grand scale using those tools; my brain knows how to use the old ones and can project at low resolution, then fill in all the gaps at high resolution. I need to burn these new abstract factories and decorators and other bullshit into my contextual thinking before I can just throw down immensely-complex, well-architected computer programs. I know the whole deal with being from the old school, and I know how hard it is to change; I also know what worked for the last set of problems doesn't fit this new set. That's sort of foundational knowledge for me [wordpress.com]: the correct approach depends on the problem, not on what your favorite tools are.
Re:Article and comments missing the point (Score:4, Interesting)
You can not do functional programming in C
Looks like you're right; I'm one paradigm off. When I started, the programming books I used didn't talk about using subroutines as a major programming structure; using function calls was new when I got into C.
Understandable if your background is so limited
The books are a mess to read. They're not well-organized, they're not well-written, and they don't convey information. They have a lot of information, but it's organized like shit.
Imagine if you got in a car with 7 pedals. Depending on what combination of pedals you hit, the accelerator or brake may come on, and the gears may switch to a particular configuration. To accelerate in third, you need to hit pedals 3 and 5; to accelerate in first, you need to hit pedals 2 and 7; to brake, you hit pedals 4 and 7. Is your difficulty driving this beast a matter of your background being limited, or the interface being fucking retarded?
Human memory is associative, and heavily benefits from organization.
consider how far Tony Bevis would have got if he had not had the shoulders of giants to stand on
He took the disorganized mess out there and produced a couple books covering concepts in ways people can more readily understand. That reduces the amount of time a person must invest to develop a particular skill. That's the same thing the original GoF and Code Complete books did, except they brought together more information and didn't do it as clearly.
Re: (Score:2)
FYI: C paradigms are called imperative and/or modular.
Well, I found the book cool, as it basically only structured knowledge I already had (I did a lot of OO design and framework construction at that time), but when we e.g. needed to use a "factory method", we had no "handy name" for it. While stuff was clear on UML diagrams (which were still not in wide use, as tools cost in the $5000 range), it was hard to talk about simple things.
As posted to others: I find the book easy to read, but that might be that two
Re: (Score:2)
It was fairly obvious from context that he meant "I became functional as a C programmer".
And, yes, the Gang of Four book was the worst-written technical book I've ever read. Never was something so simple explained so badly. Fortunately, it was mostly about coping with deficiencies in C++ and the world has moved on (even C++ has moved on, though some of those design patterns are still needed).
Re: (Score:2)
Well, as I said before: I find them exceptionally well written.
I guess it is a matter of taste.
And from context: it was pretty clear he meant imperative programming but thought, because C uses functions, that functional programming would be the same. I doubt he meant anything along your interpretation.
But alas, such are words ... written words even ... people spend lifetimes decrypting the meaning of simple words in books ;D
Re: (Score:2)
Hm, nice :D
Same old, same old (Score:2)
Extrapolating from today (Score:3)
CI/CD systems will automate the heck out of everything, and there will be less and less visibility into what's running where and how.
"Cloud Native" applications designed around microservices with well-defined interfaces and running in some PaaS "somewhere" will become the norm. I sadly foresee that developers themselves will be expected to become microservices, basically expected to do one thing only, and one thing well, and forbidden to look beyond their immediate horizon of the ever rolling Agile backlog. There will be less space for creativity at the individual level, and massive invisible machine learning software running in the back-end of the datacenters will automatically generate "facts" for the suits in charge, and possibly even stories on a backlog based on those facts. In 20 years, they'll generate their own code.
Re: (Score:3)
I'm down for a NuLisp/Prolog future.
if you get too old you are not allowed to program. (Score:2)
Instead, we must con women to do what they (quite rationally) don't want to do, so we can get our statistics right.
Remember this the next t
single payer healthcare or no more employer based (Score:2)
single payer healthcare or no more employer based system will likely be in the US in 10 years
Re: (Score:2)
Instead, we must con women to do what they (quite rationally) don't want to do, so we can get our statistics right.
Minor logical error there. Sit in on an HR strategy
It will be even worse than today. (Score:3)
Ten years ago, I was coding gnarly C++. Today it's even more gnarly because the projects are bigger and the problems more subtle. I think my only way out of this trap will be to make a conscious decision to stop, but even if I opt out, others will be in there doing the same basic stuff to make everything keep running.
The Objective-C knowledge I began developing in 1988 will probably be less useful in ten years, though. If you had asked me in 1995 if I would be intentionally avoiding Objective-C work in 2015 because of burnout, I would have laughed at you.
I hope that my Perl knowledge will be useless in ten years, but I fear that it will be the most lucrative system I know.
In the 80's, software engineering was an optimistic industry: structured programming had helped so much, object-oriented programming seemed likely to make things easy, logic programming was going to automate a lot of stuff, we were going to move upstream to direct solvers and provers. Sometime in the 00's, everyone gave up and decided that optimism was overrated and software engineering would never earn the "engineering" part, so instead let's just try to mitigate the vicious cycles to keep them from going too far foul. I think in ten years, things are going to look basically the same as today, with minor evolutionary additions, and we might even argue about whether things have changed enough to be worth talking about.
Graphically (Score:3)
If you'd told me 10 years ago that I'd mostly be programming in LabVIEW today, I would have laughed. It's "not a real language". It's proprietary. Manipulating graphics takes so much longer than typing. Etc.
I still don't like that it's proprietary.
API hell (Score:3)
I suspect, given the trends of the past decade, that there will be more pseudo-code-looking scripts written in the language du jour than actual code. API calls to API calls that invoke still other APIs, without any understanding of what is actually being executed or on what platform it is executing. There has been a lot of effort put into making 'coding' simpler and more distributed, which has many faults. First and foremost, the simpler it is to code, the dumber our coders become. Similarly, the more distributed we get, the harder it is to diagnose problems.
It used to be that a good debugger was all you needed. Now you can barely even tell what is going on without a sniffer trace, and even that will leave you wanting for some piece of the puzzle. I'm not suggesting a return to the days of COBOL, but not all advances result in better code.
Emacs, of course! (Score:2)
Whatever the language or application, the one constant for me has always been the editor. I'm sure that will remain the same.
Programmer D. Cheney (Score:2)
But...they told me Agile was eternal... (Score:4, Funny)
My Scrum Lord says that I'll drive peak stakeholder value for a billion years if I but open my heart to the One True Methodology.
Re: (Score:2)
Every Scrum practitioner worth his salt will tell you: there is no one true methodology.
There is plenty of work that cannot be done in Sprints, e.g.:
Imagine an emergency treatment center ... priorities shift as injured victims get delivered. And when for three weeks not a single patient shows up, the only work you have "done" is your paperwork ... but who cares.
I hate those Scrum haters who have no clue about Scrum (or XP, or Kanban, or Crystal Methods, or other agile methods), because they are too dumb to gras
the jig is up (Score:2)
a. all the hype of new languages, frameworks, and platforms will be debunked as fast as the blitz/viral marketing efforts that we see today.
b. the "mystery" of coding will be old hat from a nomenclature standpoint. Everyone will recognize (not necessarily understand) FIFO queue, certs, etc..., and even understand what a buffer overflow means. It will not affect careers & salaries in s/w, but will call out overrated tasks.
c. As much as Silicon Valley wants coders to be rock stars (e.g. as in Silicon Valley)
Visual Haskell (Score:5, Interesting)
There's a wealth of new research going on in Programming Language Theory, with several breakthroughs in recent years bridging the gap between functional and imperative programming.
The other trend in declarative programming is reactive frameworks like React.js and Flux being applied to user interfaces. This allows for tools like React Native, which can abstract away all the spaghetti code to handle events, providing a higher abstraction, including the "debug & rewind" and "live programming" capabilities seen in online "web embedded" environments like GitHub Gist or JSFiddle.
I expect that, as these techniques mature, they will settle down into development techniques that allow for easy discoverability of APIs without having to learn a particularly complex syntax, and better programming by connecting components without the drawbacks and limitations of classic visual tools.
All these new techniques based on Category Theory are driving advances in mainstream languages: starting with libraries like LINQ and jQuery, but also Python, JavaScript and even C++ adopting lambdas, advanced type systems with auto-inference of types, and libraries with constructs for declarative race-free parallelism such as promises and agent models.
The majority of those techniques are being tested first in experimental languages by researchers eating their own dog food, with Haskell often being their purest form (see what I did there?). Anyone interested in enhancing the expressivity of PLs may lurk at Lambda the Ultimate [lambda-the-ultimate.org], where guys much more clever than you and me hang around and can give pointers to all the relevant theoretical results.
Simple (Score:3)
Use my neural interface to write a program to a data crystal that can display on the holodeck. Then leave work in my flying car.
Waiting for the rest of you to catch up (Score:2)
As I continue to work in array-oriented languages like APL or J, as I have for years, it's interesting how very slowly the new languages are re-discovering things we've known for decades. As someone I know said, "Google invented map-reduce in 2004 and Ken Iverson cleverly re-invented it in 1964".
Eventually, the idea that novice errors are irrelevant to language design may slowly work its way into the mainstream, along with any number of other unrecognized language desiderata that will seem obvious in retrospect.
15 years ago vs Now vs +10 (Score:2)
No matter what you do, how you describe it, what tools you use, or even how you plan it, at a certain point you just have to do the thing: actually write the code.
As little as possible (Score:2)
Coding, coding never changes (Score:2)
Spend the first quarter of the day in meetings, either daydreaming or staring off into space. Spend the second quarter responding to email, talking to customers on the phone, or talking to coworkers about interfacing our code. Spend the third quarter doing some vim. Spend the fourth quarter browsing the internet and waiting to go home.
Really, the actual typing-away part of my job is pretty small. I probably spend at least as much time whiteboarding.
End of Java (Score:2)
Language-wise:
1. Hopefully the death of Java and similar GC knock-off languages like C#. The world would be a really nice place without them.
2. Expect the demise/marginalisation of dynamic languages like Ruby, Python, etc. (Python may survive for a while, as some of the NLP libraries are written in it).
3. Expect JavaScript to become the de-facto language. On the web, in fact, it already has; but it has yet to get closer to the OS and general hardware.
4. C/C++ will remain there as long as there are hardware and peripherals. Fo
How Will You Be Programming In a Decade? (Score:2)
Re: (Score:2)
You better hope we still have mice and keyboards, 'cause if we don't we'll probably be forced into using some kind of horrendous touch interface. (shudder) Or even worse: voice! Can you imagine the typical steno-pool sea of developer cubicles with everyone jabbering code at their computer? I doubt neural I/O will ever be practical in my lifetime, or at least not as efficient as keyboard/mouse.
Ontology is the key (Score:3)
After nearly 20 years doing this, I believe that what software developers actually do all day long is primarily ontology, and secondarily engineering. Conceptually fitting the real world into a little box filled with transistors is hard. You can't automate thinking (at least not yet). The computer world is a limited representation of the real world, and that translation, deciding which things to take and which to leave, and what shape they take in the virtual, is something a computer cannot do, and surely won't be able to do in 10 years' time, possibly not even in 100. Until then, programmers will be doing the same thing they do today, just in a slightly faster format with slightly updated tools and slightly slicker interfaces: crafting a virtual, limited representation of the real world that allows modeling, and generating knowledge and value from data.
Re: (Score:2)
Re: (Score:2)
I know one of the founders of a software house that started up to provide a vertical application and hired 6 programmers to do the work.
One of the directors reportedly asked "What are we going to do with these expensive programmers, once all the programs have been written?"*
Last I checked, they probably had 500 programmers working for them and were making money hand over fist.
====
* You can tell that this was a long time ago. These days we know what you do with "inconvenient" employees. At the drop of a hat.
Re: (Score:2)
Wrong. There will be a lot more new hardware requiring low-level programmers to write interfaces that connect with modern APIs and such.
As for enterprise software dev, you're probably not far from the truth, as GUI-based programming has really grown inside corporations and enterprises. GUIs do OK with simple designs, but when you get into more complex designs involving highly engineered solutions you still need code to be written. Keep in mind that code can be far easier to read than a GUI in most cases. GUI
Re: (Score:2)
we will just use GUIs to connect inputs and outputs and link up Java-Scripts
And that will become tedious to the point where a text-based description is easier. Some tiny little voice will suggest simply re-purposing the Java Script they're already using, but that voice will be silenced by a chorus of people extolling the virtues of some New Paradigm and its associated language. The cycle will continue. Also, stuff people learn in no more than 200-level CS courses will be re-cycled, re-named, an
Re: (Score:2)
I can't change the gender I was assigned at birth. That's a matter of public records.
No, but in a decade or three, you may very well be able to change your sex at the genetic level, as well as reset your age.
Re:Coding is for Girls (Score:4, Interesting)
Random feminists do not have access to your birth records. So in the future when anyone can get a CRISPR (or whatever) genetic treatment and suddenly look like a 25-year-old male or female of their choice, this stuff will largely be irrelevant. If there's hordes of stunningly gorgeous, 25-year-old women working in tech jobs, no one will be able to figure out, without some serious digging, if they were actually born female 25 years ago, or if they're actually 75-year-olds who used to be beer-bellied, balding men.
Re: (Score:2)
So in the future ... If there's hordes of stunningly gorgeous, 25-year-old women working in tech jobs, no one will be able to figure out, without some serious digging, if they were actually born female 25 years ago, or if they're actually 75-year-olds who used to be beer-bellied, balding men.
Does saying "you know, I think Jar-Jar was kind of a fun character" count as serious digging?
Re: (Score:2)
...or if they're actually 75-year-olds who used to be beer-bellied, balding men.
Surely our slashdot IDs will still give us away as old fogies.
Re: (Score:2)
Just like there's little stopping you from changing your legal name when you get CRISPRed into a 25-year-old hottie, there's nothing stopping you from creating a new Slashdot account.
Re: (Score:3)
Lol.
Yep, I'll be living as a woman in 10 years, unless I die homeless in a gutter. There really is discrimination out there. It's just that it comes from management. But blame me! Sure, that's done a lot of good to fix the problem!
I will not be programming, at least not professionally. The gender insanity will continue in tech. I can't change the gender I was assigned at birth. That's a matter of public records
The requirements for changing the gender marker on birth certs, etc., have gotten a lot easier since the turn of the century. It's in recognition of several facts:
Re: (Score:2)
telling the robots what it is we want
I was involved in this sort of thing over 20 years ago. It worked well for certain classes of applications. On the other hand, it never went anywhere. I suspect that it never will. Because the people* that fought it back then and will sabotage it in the future are the CS grads who will be needed to implement it but will be put out of jobs.
*The people that built our natural language recognition/code generation systems were a bunch of mechanical and electrical engineers. The computer science people were the
Re: (Score:3)
>> a runtime that runs 100% the same on all platforms
(spits out milk through nose)
Re: (Score:2)
>> a runtime that runs 100% the same on all platforms
(spits out milk through nose)
It's easy enough to write Java programs that work on *nix and not on Windows. Create two classes whose names differ only by case, such as MyClassFile (for the class itself) and MYCLASSFILE (holding final constants), in the same directory; the compiler emits a separate .class file for each. Copying those files to Windows fails, as one overwrites the other on the case-insensitive filesystem.
Re: (Score:2)
Re: (Score:2)
I use a specially trained butterfly that flaps its wings just right...
Oops, I stepped on your butterfly - there goes the future !
Re: (Score:2)