Ask Slashdot: Why Are We Still Writing Text-Based Code? 876
First time accepted submitter Rasberry Jello writes "I consider myself someone who 'gets code,' but I'm not a programmer. I enjoy thinking through algorithms and writing basic scripts, but I get bogged down in more complex code. Maybe I lack patience, but really, why are we still writing text-based code? Shouldn't there be a simpler, more robust way to translate an algorithm into something a computer can understand? One that's language-agnostic and without all the cryptic jargon? It seems we're still only one layer of abstraction from assembly code. Why have graphical code generators that could seemingly open coding to the masses gone nowhere? At a minimum, wouldn't that eliminate time spent dealing with syntax errors? OK Slashdot, stop my incessant questions and tell me what I'm missing." Of interest on this topic: a thoughtful look at some of the ways that visual programming is often talked about.
April 1st isn't for a few more months (Score:5, Informative)
I think the /. folks think it's an early April Fools' joke. Not write code using text? That's like saying we should write books with only pictures. Sure, it can be done, but it doesn't work for all books.
Maybe beta is an early April Fools joke too.
Re:Labview (Score:4, Informative)
I use LabVIEW all the time and it does exactly what's advertised. I'm a hardware guy but occasionally need things done in software. Sure, it's not optimal, but it gets the job done.
How are circuits designed today if they are not drawn?
It's because you get bogged down (Score:5, Informative)
So-called "visual programming", which is what you're wanting, is great for relatively simple tasks where you're just stringing together pre-defined blocks of functionality. Where you're getting bogged down is exactly where visual programming breaks down: when you have to start precisely describing complex new functionality that didn't exist before and that interacts with other functionality in complex ways.

It breaks down because of what it is: a way of simplifying things by reducing the vocabulary involved. It's fine as long as you stick to things within the vocabulary, but the moment you hit the vast array of things outside that vocabulary you hit a brick wall.

It's like "simplifying" English by removing all verb tenses except simple past, present, and future. It sounds great, until you ask yourself, "OK, now how do I say that this action might take place in the future, or it might not, and I don't know which?" You can't, because in your simplification you've removed the very words you need. That may be appropriate for an elementary-school-level English class where the kids are still learning the basics, but it's not going to be sufficient for writing a doctoral thesis.
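The vocabulary point can be made concrete with a toy sketch (everything here is made up for illustration, not any real product): a block-based system is essentially an interpreter over a fixed set of blocks, and a program that needs an operation outside that set simply cannot be written without extending the interpreter itself in text.

```python
# Toy sketch of a "visual" language: programs are trees of predefined
# blocks. Anything not in BLOCKS is inexpressible by construction.
BLOCKS = {
    "add": lambda a, b: a + b,
    "mul": lambda a, b: a * b,
    "neg": lambda a: -a,
}

def run(node):
    """Evaluate a block tree like ("add", 2, ("mul", 3, 4))."""
    if not isinstance(node, tuple):
        return node  # a literal value
    name, *args = node
    if name not in BLOCKS:
        raise ValueError(f"no block named {name!r}: vocabulary exhausted")
    return BLOCKS[name](*(run(a) for a in args))

print(run(("add", 2, ("mul", 3, 4))))  # 14: fine, it's within the vocabulary
# run(("mod", 7, 3)) raises ValueError: there is no remainder block, and
# the only fix is to edit the interpreter -- i.e., to write text code.
```

Composing known blocks works indefinitely; the brick wall is the first operation the block designers didn't anticipate.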
Look at RpgMaker (Score:5, Informative)
Kind of a weird example, but RpgMaker is a tool that lets non-programmers create their own RPG games. While there is a text-based code (Ruby) layer, a non-programmer can simply ignore it and either use modules other people have written or confine their implementation to the built-in functionality.
Now look at the complexity involved in the application itself to enable the non-programmer to create their game. Dialog boxes galore, hundreds of options, great gobs of text fields, select lists, radio buttons. It's just overflowing with UI. And making an RPG, while certainly complex, is a domain-limited activity. You can't use RpgMaker to make an RDBMS, a web framework, or an FPS game.
The explosion of UI complexity needed to solve the general case -- enabling the non-programmer to write any sort of program visually -- is stupendous. With visual tools you'll always be limited by your UI and what the UI allows you to do. Also think of scale: we can manage text-based software projects up to very high volumes (it's not super easy, but it's doable and proven). Chromium has something like 7.25 million lines of code. I shudder to think how that would translate into some visual programming tool.
I'm not sure how well it would scale
Simple: Text is most expressive (Score:4, Informative)
All other potential "interfaces" lack expressiveness. Just compare a command line to a GUI. The GUI is very limited; it can only do what the designers envisioned, and that is it. The command line allows you to do everything possible.
So, no, we are not "still" using text. We are using the best interface that exists, and that is unlikely to change.
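One way to see the composability point (a made-up example, not from the thread): a handful of generic text-language primitives combine into an ad-hoc task no GUI designer specifically anticipated, such as tallying the most common words in a piece of text.

```python
# An off-the-cuff task composed from generic text primitives
# (regex, a counter, a function): no dedicated GUI needed.
import re
from collections import Counter

def top_words(text, n=10):
    """Return the n most common lowercase words in text."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words).most_common(n)

sample = "the cat sat on the mat and the cat slept"
print(top_words(sample, 3))  # [('the', 3), ('cat', 2), ('sat', 1)]
```

A GUI could of course offer a "word frequency" button, but only if its designers thought of it first; text lets you compose the feature on the spot.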
Re:Labview (Score:4, Informative)
How are circuits designed today if they are not drawn?
They are synthesized by XST, Synplify Pro, or a similar tool.
Slashcott Feb. 10-17!
Re:Lego Mindstorms (Score:5, Informative)
Try Lego Mindstorms
Be aware that the Lego NXT software (haven't tried the EV3 stuff yet) is seriously crippled compared to LabVIEW (which it was based on); in particular, you can't take "wires" in and out of structure blocks.
I have used LabVIEW a bit and find several things annoying:
1: There is no zoom functionality (apparently this is the #1 most requested feature).
2: Unlike variable names in traditional code, wires in LabVIEW typically don't have names. This makes it hard to understand what each wire is for. (Yes, I'm pretty sure there is a way to label them, but it's extra work, not something that comes naturally as part of coding the way it does in traditional languages.)
3: I can never remember what all the little pictures on the blocks mean.
4: I find connecting the blocks very fiddly.
Having said that some people seem to like it.
Re:Power. (Score:5, Informative)
As an EE I call complete bullshit on this. Other than simple circuits, most modern electronics design is done just like software: textually. Ever heard of Verilog and VHDL? They have largely replaced schematics. You can't have a schematic for a modern complex IC; for something like a SPARC T5 or Core i7, the schematic would cover a fairly large state like Florida. There are so many pathways that even labeling the lines would be problematic. The days of terse signal names like CS0 are over; VHDL supports structured naming. These days, netlists for complex chips are huge.
The tools and techniques for IC design have moved to a more textual mechanism precisely because text is better at dealing with complex abstractions. Please don't tell us what EEs do if you aren't an EE.
Re:I think IBM is working on it (Score:5, Informative)
I'd also throw in that we're still writing text-based messages -- even though competing pictogram systems have been developed, none has caught on well enough to displace the written word, composed of characters from a relatively small alphabet.
I made a LabVIEW-like graphical system for compiling algorithms into parallel processors as part of my master's thesis. It used a schematic capture program to build the hierarchical graphical "data flow" -- but at their core, the basic modules were still short text-based programs.
Some "programmers" might operate at an entirely graphical, connect-the-Lego-blocks level, but sooner or later it comes down to 1s and 0s, and somewhere along that chain (several somewheres in modern practice) it will be represented in text-based languages.
I think the graphical systems haven't taken over the heavy lifting because they're all too specialized, my thesis included. Most "real jobs" need more flexibility than the non-text-based systems can provide. Having said that, I use 100% graphical UI design; even though the tools just translate it to XML for me, I almost never "get my hands dirty" in the text-based code for my UIs anymore.
Symbolic characters are on the decline. (Score:5, Informative)
>surviving languages use symbols representing sounds
over a billion people have a few symbols with you...
Are you referring to the Asian languages that use Chinese characters?
- Vietnamese used to be written in Chinese characters, it now uses the Latin alphabet.
- Korean replaced Chinese characters with the phonetic Hangul 500 years ago.
- Japanese has not one but two phonetic alphabets to go along with their Chinese characters. They mix all three together, and you can tell a passage is intended to be simple to understand when it is all phonetic except for the simplest Chinese characters.
- Even China simplified the traditional characters because they were deemed too hard to learn. Schoolchildren are taught new Chinese characters via pinyin, a phonetic scheme that uses Latin characters. Since the characters themselves aren't phonetic, when Chinese borrows foreign words it matches the foreign pronunciation with the set of Chinese characters that have the closest pronunciation. The result is a mix of characters where some keep their original symbolic meaning and others only stand in for their pronunciation. Think of the "what your name means in Chinese" party trick.
- Typing Chinese characters usually means typing out the pronunciation and then selecting the character.
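That input flow can be sketched in a few lines (a minimal illustration, not any real IME; the candidate table is a tiny hand-picked sample): the user types a romanized pronunciation, the input method offers candidate characters, and the user picks one.

```python
# Minimal sketch of pinyin-style input: type a pronunciation,
# get candidate characters, select one. The table below is a
# tiny illustrative sample, not real IME data.
CANDIDATES = {
    "ma": ["妈", "马", "吗", "码"],   # mother, horse, question particle, code
    "shi": ["是", "十", "时", "事"],  # to be, ten, time, matter
}

def candidates(pinyin):
    """Return the candidate characters for a typed pronunciation."""
    return CANDIDATES.get(pinyin, [])

def pick(pinyin, index):
    """Return the character the user selected from the candidate list."""
    return candidates(pinyin)[index]

print(candidates("ma"))  # ['妈', '马', '吗', '码']
print(pick("shi", 0))    # 是
```

A real IME also ranks candidates by frequency and context, but the basic shape is the same: phonetic text in, symbolic character out.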
I think the point that symbolic characters are on the decline is very valid.