Is Assembly Programming Still Relevant, Today? 676

intelinsight asks: "Consider the following question in light of current software development needs, and the claims of assembly lovers that it is a language that gives one insight into the internal workings of a computer: how relevant or useful is it to learn assembly language programming in the current era?"
This discussion has been archived. No new comments can be posted.

  • Glad I did (Score:5, Interesting)

    by linuxwrangler ( 582055 ) on Thursday March 22, 2007 @07:39PM (#18451843)
    I'm glad I took assembly. I've never "used" it in the traditional sense of writing an application other than in school, but understanding how things work "under the covers" (whether at the CPU, hard-disk or network level) has provided valuable guidance in day-to-day design and troubleshooting.

    I've worked with people with very focused high-level programming skills and found that while they could write mostly decent code, their code was also most likely to fail in production since they were completely mentally removed from concepts like disk-seek times or bandwidth constraints. Programmers with a deeper understanding of what actually happened when their code ran tended to make wiser programming choices.
  • by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Thursday March 22, 2007 @07:41PM (#18451875) Homepage Journal

    Isn't knowledge of assembly language for microprocessors required to create a higher level programming language?
    In theory, you could just make your higher-level programming language's compiler emit C or C-- [cminusminus.org] and then let the maintainers of GCC and other mid-level language compilers do the dirty work for you.
  • by softwaredoug ( 1075439 ) on Thursday March 22, 2007 @07:43PM (#18451897)
    One day, won't there be little nanobots floating around with 512 bytes of memory and a 1 MHz processor, needing to buzz around your body and eat up your precancerous cells? I imagine that as things get smaller, the miniaturization frontier of computing necessitates limits on computing power and memory. That may necessitate a new generation of assembly programmers. Even today, in the miniaturization/embedded/realtime world where many enjoy programming away in plain old C (like me), knowing assembly is useful: first, to look at the compiler's output and figure out what the hell it's doing; second, to have a plain basic understanding (not necessarily a detailed one) of what your C statements and operations are probably turning into at the instruction level.

    Another question: would assembly be more popular if it weren't such a nightmare to write for Intel's x86 architecture? If we all had nice Motorola PCs, would assembly be really cool?
  • Definitely (Score:2, Interesting)

    by SaidinUnleashed ( 797936 ) on Thursday March 22, 2007 @07:46PM (#18451943)
    I just got a SDK-86 to start learning it!
  • by Anonymous Coward on Thursday March 22, 2007 @07:54PM (#18452029)

    Learning assembly language is good because it gives you a better sense of processor architectures and what the compiler has to do. Think of it as similar to learning how to add and multiply by hand, even though you will probably use a calculator for most arithmetic in the future.

    Do not delude yourself into thinking assembly is the ultimate gateway to speed though. I would bet most of the people advocating assembly language programming to make your code go faster cannot write assembly better than today's optimizing compilers. Similarly, you probably won't be able to do better than the compiler either, with one major exception: Compilers do not generate effective SIMD code yet (gcc is slowly trying to add it, but it is pretty primitive). This is because popular programming languages do not express data-parallel algorithms in a way that makes it easy for compilers to spot them and generate the appropriate code. Many media libraries use assembly language primarily for this reason, and not because they are bad-ass programmers who can allocate registers better than gcc.
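    As a concrete sketch of the kind of data-parallel loop in question (the function name is made up for illustration): a saturating byte-add in plain C maps almost directly onto a single SSE2 instruction (PADDUSB), yet compilers of this era rarely made that leap unaided, which is why media libraries hand-coded it:

```c
#include <stddef.h>

/* Element-wise saturating add of two byte arrays: each output byte is
 * min(a[i] + b[i], 255). On x86, 16 iterations of this loop collapse
 * into one PADDUSB instruction when vectorized by hand. */
void sat_add_u8(unsigned char *dst, const unsigned char *a,
                const unsigned char *b, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        unsigned int s = (unsigned int)a[i] + (unsigned int)b[i];
        dst[i] = s > 255 ? 255 : (unsigned char)s;
    }
}
```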

  • by oaklybonn ( 600250 ) on Thursday March 22, 2007 @08:10PM (#18452197)
    The best way is gcc --save-temps: write your C code, then study what gets generated and how it differs depending on what you write, in a language you're already familiar with.
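    As a toy illustration (the function is hypothetical), compiling something like this with gcc -O2 --save-temps leaves a .s file you can read side by side with the C source:

```c
/* Compile with: gcc -O2 --save-temps sum.c
 * then read sum.s to see what the loop became, and compare
 * against the -O0 output to see what the optimizer did. */
long sum_to(long n)
{
    long total = 0;
    for (long i = 1; i <= n; i++)
        total += i;
    return total;
}
```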

    I spent 6 months recently working for a company that programmed mainly in C++ and Visual Basic (I'm a Mac programmer, so I was a fish out of water there, which is ultimately why I left). But the developers there didn't / couldn't / wouldn't understand assembler in any way, shape or form. Without the high-level debugger, they were lost. So when their app crashed, they were helpless to understand why.

    I was appalled. I've spent 20 years debugging crashes, and even though I don't speak x86 fluently, I can at least find my way around it. How are people who can't read assembler ever able to ship quality products?

    Of course, the best way to learn it is using an interactive assembler: go back in time to 1998 and use TMON Pro with MPW. With an 11-minute link time, you'll quickly learn that it's easier to patch your bugs in assembler and continue execution, or write a little subroutine in "playmem" to do something, than to terminate your app, make a small change, and relink.

    God I miss TMON.
  • Re:All's quiet (Score:5, Interesting)

    by Qwertie ( 797303 ) on Thursday March 22, 2007 @08:14PM (#18452259) Homepage

    Here are a few reasons you might need proficiency in assembly language:

    • You're writing software for a low-speed or low-memory chip for an embedded system (e.g. one of the PIC chips). Such chips are used either because they are cheap or because they need very little power. You can often program these chips in some variant of C, but if you need that last drop of performance, you use assembly.
    • You're writing a compiler. In this case you may not have to write assembly directly, but you'll have to understand it intimately in order to convert source code to machine language. (Replace "assembly" with "bytecode" or "IL", if making a Java or .NET compiler)
    • You are reverse-engineering closed-source software (another case where you must comprehend assembly)
    • You're designing or testing a computer chip, in which case you may have all sorts of test cases written in assembly language.
    • You're maintaining an old "legacy" system that uses assembly.
    • You're writing an emulator for another computer, and you need high performance. In this case you may need to understand the assembly language of both the real and emulated machines, as I learned when I wrote a Super Nintendo emulator.
    • Those bastards make you study it in one of your college courses.
  • Yes! (Score:1, Interesting)

    by Anonymous Coward on Thursday March 22, 2007 @08:23PM (#18452369)
    I was asked this question by the new head of the Computer Engineering department at my college, RIT, who was thinking of removing the requirement from the coursework. My point then (and now) is that, as others have pointed out, it's important to understand the assembly language of the processor you're working on, to aid in debugging and the like. As was pointed out, very little assembly "programming" is done today, only the items close to the hardware (drivers, kernel code, etc.). In the past, I've actually found compiler bugs by looking at software in assembly and spotting how the compiler screwed up a procedure prologue. Granted, today's compilers are MUCH more mature (thank you, compiler gurus!). However, debuggers and other diagnostic tools still need to know the underlying architecture, so being able to read the "raw" assembly code is a very handy thing. One area, as pointed out, is DSPs; these are still heavily programmed in assembly, as that's the best way to squeeze every possible MIP out of the device. Most DSPs can juggle multiple things at once. I spent time in the early '90s working with TI C25 DSPs. There are some very handy things you can do, but they're limited in other ways, in terms of RAM space, etc. (It was the wrong CPU for the job, but that's what the PHBs wanted to use, as it was cheap.)
    As an Embedded Systems Engineer with 25 years experience, Assembly language is important to what I do, even though I don't use it all the time. I still enjoy the ability to turn on the "disassembly" mode of the debugger and single step through the problem.
  • Re:Yes. (Score:5, Interesting)

    by A nonymous Coward ( 7548 ) * on Thursday March 22, 2007 @08:24PM (#18452407)
    I know far too many programmers who haven't a clue what is going on under the hood, so to speak, and have zero comprehension of what operations take longer than others. For instance, consider a C programmer I know who thinks strcat is a good routine to use.

    (For the ignorant, it takes two string pointers and copies the second one to the end of the first one; this requires zipping all the way from the start to the end of the first string to find where said end is. It then helpfully returns the pointer to the beginning of the first string, the very parameter you passed in. Never mind that the new end of the first string would be very handy for the next strcat to that same string.)

    This programmer is generally good at what he does, but the idea that strcat is woefully inefficient is not obvious to him. Even after having it explained, yes, he will avoid it, but he does not really understand why. He, and far too many other programmers, measure their program's "speed" in lines of code. Sure, they know that a subroutine call has to count those subroutine lines of code as well, but they simply have no concept of the fact that some operations take longer than others, and that there are better ways of doing simple things.
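    A minimal sketch of the alternative (the helper name is invented here): carry the end pointer forward yourself, so each append costs only the length of the piece appended, instead of rescanning the whole string the way repeated strcat does:

```c
#include <string.h>

/* Append `piece` at `end` (the current NUL terminator) and return the
 * new end, ready for the next append. No rescan of the whole string. */
char *append(char *end, const char *piece)
{
    size_t len = strlen(piece);
    memcpy(end, piece, len + 1);   /* +1 copies the terminating NUL */
    return end + len;
}
```

    Building a string then becomes end = append(end, "foo"); end = append(end, "bar");, turning an O(n^2) chain of strcats into O(n).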

    I think every beginning programmer should have to spend a semester on some funky old Z80, for instance, all in assembler, debugging in machine language, before they can call themselves a good programmer. The idea is not to get them skilled in Z80s, but to give them a basic idea of how computers work.

    It's the equivalent of learning to drive a stick-shift car without understanding why there are gears at all. If you are ignorant of the very basics of internal combustion engines, and can't understand the dangers of lugging or redlining an engine and the importance of using the right RPM, you will never be a good driver. It matters not whether you ever drive a stick in real life; it's just a matter of knowing how to handle your equipment.
  • by moocat2 ( 445256 ) on Thursday March 22, 2007 @08:36PM (#18452525)
    While I have not written any assembly since college, I am really glad I know it. As an engineer who works in C/C++, sometimes it is really helpful in debugging to see what is happening at the assembly level.

    So, I would definitely recommend at least being acquainted enough with assembly that you can semi-understand a listing.
  • by FrankSchwab ( 675585 ) on Thursday March 22, 2007 @08:38PM (#18452543) Journal
    We're in the process of doing an SoC ASIC, with a 32-bit CPU, analog sensing hardware, USB and other communications ports, sophisticated low-power wakeup mechanisms, and RSA/AES/SHA-256 hardware. It contains only 32KB of ROM and 12KB of RAM. We expect the part to require less than 10 mA of current in full-scale operation (generating 1 MB/sec of encrypted sensor data). We expect the parts to sell for less than $3, including several bits of external hardware, into a highly competitive marketplace. If our parts cost $0.10 more than our competitors' or take 10 mA more current, we're out of business. We expect to sell millions of parts per month.

    ROM and RAM comprise the largest space on the die, and die cost is roughly linearly proportional to area: doubling the size of the die doubles the cost. As a result, we don't have the luxury of embedding Linux, throwing a couple more MB of RAM at the problem, or increasing the clock speed. We certainly don't have the luxury of throwing this week's latest, greatest academic language at the problem. C and assembly are the only way this product is going to survive.

    I think you can be a fine Web programmer without knowing assembly or 'C'; I think you'd be a better one after one assignment to a project like mine, where every design decision is made and every line of code is written with a thought to "how fast is this going to run, how much code does this generate?", rather than "how do I get this done fastest and easiest?". There are many situations where the "throw more hardware at it" approach is valid; there are also many situations where it isn't.

  • Very important (Score:2, Interesting)

    by SpaghettiCoder ( 1073236 ) on Thursday March 22, 2007 @08:42PM (#18452599)
    You need assembly language to write routines that directly access the CPU (perhaps a fast routine for a simple operation in your program that gets called many times), to address specific memory, and to drive the input/output of various hardware: your screen, your ports, everything.

    There are limitations to what high-level languages can do. When I started out on the Amstrad CPC, I remember the computer booted up into a very inefficient non-standardised BASIC, which had many added commands which were really functions to allow access to the unique hardware of the CPC (sound and graphics). These were ridiculously slow, with the Z80 processor and 32K memory available for programs. If you wanted to make a simple animation for example (say, a little 2D sprite walking across the screen), you needed to do it in Z80 assembly language IF you wanted the computer to do ANYTHING else at the same time (e.g. scan for keyboard interrupt, play sound chip in background, or even just a second sprite).

    x86 assembly language isn't hard, but it's the only way you're going to be able to play control freak with your PC hardware. Many excellent Linux applications are done almost 100% in assembly language, including the excellent SNES emulator ZSNES (which makes much the same demands on Linux as the old 8-bit Amstrad, and keeps perfect time). If speed is really important then there's no other way. For some reason I think secret organizations like GCHQ would employ skilled assembly programmers so they can keep looking for ways to brute force public-key encrypted messages in the shortest possible time.

    In my opinion, a lot of people can use a few libraries in VC++ or similar to make useful programs, along with newer things like Python, Perl, etc., but being able to use some built-in functions doesn't necessarily make them a "coder". To be a "coder" means you PREFER plain C, and know assembly language. (Even though assembly language is different for every platform, an assembly programmer knows exactly what he needs to find out about opcodes, registers, memory addressing, and interrupts to apply the same techniques on a new machine.)

  • by starseeker ( 141897 ) on Thursday March 22, 2007 @08:43PM (#18452609) Homepage
    I have always wondered what would have happened if the idea of using Lisp as the assembly language of a machine had actually taken off. If I understand the Lisp machines correctly, they were actually "lisp on the metal". Given the flexibility and power of the lisp language it would have been very interesting to see what the evolution of the Lisp Machines would have been, had they proved viable in the long term.
  • by jd ( 1658 ) <imipak@ y a hoo.com> on Thursday March 22, 2007 @09:28PM (#18453079) Homepage Journal
    No. You only have to generate code for another already-existing compiler. For example, this is how people use f2c. You can also write compilers that generate bytecode (some LISP compilers do this, as do many Java compilers).

    Now, if you were to ask "isn't knowledge of assembly language for a given microprocessor required to create a compiler capable of directly generating native code?" then the answer would be yes, because all the other possibilities have been excluded. Alternatively, if you asked "isn't quality knowledge of assembly language for a given microprocessor required to create a compiler that can generate code that is compact, efficient on resources and fast?" then the answer would also be yes.

    However, as most modern programs are anything but compact, efficient on resources or fast, that is a rather moot point. The best compiler in the world can't turn junk into quality, although a trashy compiler can certainly turn quality into junk.

  • Re:All's quiet (Score:5, Interesting)

    by LiquidCoooled ( 634315 ) on Thursday March 22, 2007 @09:30PM (#18453101) Homepage Journal
    Knowing about optimising registers, partitioning the stack, minimising movs, and assembly tuning in general doesn't rely on the same concepts at all.

    The GP is 100% correct in its uses and you are also correct that its current use is crap.

    We have abstracted ourselves far enough away and insulated ourselves so much that I think we are in danger of losing the point of fast computers.

    Anyone with Visual Studio can see a good example of this in how long the immediate window takes to calculate 1+1.
    It might be great and super and empower the developer to do more, but something has been lost when classic Visual Basic feels fast in comparison.

    Finding a decent optimisation of the core .NET framework would be a major benefit, and you cannot do that without an intimate understanding of assembler.

    Every time this kind of discussion comes up I think of Mel [pbm.com].
    Assembly programmers are a dying breed, but their services are more than needed even today.
  • Re:easy as 1 2 3 (Score:4, Interesting)

    by FlyByPC ( 841016 ) on Thursday March 22, 2007 @09:38PM (#18453157) Homepage
    Not all of us. I'm majoring in EET/ComET at ODU (they didn't offer the pure EE via distance-learning) -- and we're required to learn PIC assembly at least.

    Granted, PICs are much MUCH simpler than anything running a modern OS -- but learning assembler, even on a simple device like a PIC, does teach a lot about how hardware and software integrate. Also, it's kind of cool to know that (for example) exactly 1,000,000 clock cycles after the program starts up, it will be calling *this* instruction, which moves the value in the accumulator to *that* register.

    I can't be the only one out there who finds this extremely refreshing after taking a course in Java (and learning about font objects, GridBagLayouts, and other things so far removed from "real" programming that they might as well call it a Fine Arts course), can I?

    Anyway, I wasn't really looking forward to learning Assembler -- until I got started and saw how powerful, elegant, and just plain beautiful it really is.

    PICs are cool toys -- 5MIPS ain't much compared to the latest and greatest Intel and AMD have to offer -- but when you consider that they'll run for days (weeks? months?) on a CR2032 cell, and cost under a buck apiece, they're very impressive. (Freescale MCUs, too -- although I don't yet know those quite as well.)
  • by Bozdune ( 68800 ) on Thursday March 22, 2007 @09:38PM (#18453159)
    How is someone going to learn assembly?

    Well, when I was a wee lad taking 6.251 from Donovan & Madnick at MIT, they threw an IBM Assembler H manual at us and wished us good luck on the first programming task (in two weeks).

    So suck it up and RTFM. Although, to be fair, the Assembler H manual is probably one of the finest computer manuals ever written, so it wasn't as bad as it sounds.
  • Re:All's quiet (Score:5, Interesting)

    by Austerity Empowers ( 669817 ) on Thursday March 22, 2007 @10:15PM (#18453511)
    You're writing software for a low-speed or low-memory chip for an embedded system (e.g. one of the PIC chips). Such chips are used either because they are cheap or because they need very little power. You can often program these chips in some variant of C, but if you need that last drop of performance, you use assembly.

    You're writing software for any chip, on any platform, that requires direct hardware-level access: device drivers, boot code, core features. No machine, no matter how fast, can be programmed exclusively in C. For example, in C you simply cannot access a DCR on a PowerPC; that requires a special instruction without a high-level language equivalent, and you can't just cast a pointer to its physical address because it isn't mapped to physical memory. You also cannot enable or disable the instruction cache from any C function call. The list goes on: there are a number of places where it is totally impossible to use a high-level language.

    There are a whole lot of these out there, in the consumer, enterprise, military, etc.
  • Re:Yes. (Score:3, Interesting)

    by grasshoppa ( 657393 ) on Thursday March 22, 2007 @10:25PM (#18453607) Homepage
    You are both right and wrong, in a sense. You are right that he should know what's going on under the hood; you are wrong that he shouldn't use it. Program execution speed should (almost always) take a back seat to code readability. If the organization uses strcat, there is a very strong and valid argument for why he should still be using it.
  • Re:All's quiet (Score:1, Interesting)

    by Anonymous Coward on Thursday March 22, 2007 @10:26PM (#18453611)
    • Optimized code: the MPEG-4 etc. loops in mplayer use hand-optimized assembly heavily, and MP3 playback typically does too.
  • More (Score:3, Interesting)

    by DeadCatX2 ( 950953 ) on Thursday March 22, 2007 @10:35PM (#18453701) Journal
    • Designing a microprocessor. This is the worst, because you have to understand the entire architecture and what kinds of operands are possible, and you must manually enumerate opcodes and such; essentially, you are creating your own assembly language.
    • You want to take advantage of processor-specific instructions which are not in a high level language (except as intrinsics). Things like CPUID, various SSE instructions, etc.
  • Re:All's quiet (Score:2, Interesting)

    by Short Circuit ( 52384 ) * <mikemol@gmail.com> on Thursday March 22, 2007 @10:41PM (#18453745) Homepage Journal
    I've found myself in need of CPU burn-in software. Tools I've found apparently weren't designed for multi-core systems, and that's the kind of system I need to push.

    What I'd like to do is write a bunch of assembler routines that repeat different classes of instructions. One would run simple FPU operations several hundred times, another would run integer ops, another, logic ops, more would run mmx ops, sse ops, sse2 ops, sse3 ops, etc.

    The program would poll the CPU temperature every couple seconds, find the routine that causes the greatest amount of heat, and concentrate on it.

    The overall program would be a C/assembler hybrid. The burn routines would be in assembler, but the analysis and scheduling routines would be in C.
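    A rough sketch of that structure in C (routine names and the temperature hook are placeholders; real burn routines would be hand-written assembly, and read_cpu_temp would talk to an actual sensor):

```c
#include <stddef.h>

typedef void (*burn_fn)(void);

/* Each routine hammers one instruction class; the volatile sinks keep
 * the compiler from deleting the "useless" work. */
static volatile double f_sink;
static void burn_fpu(void)
{
    double x = 1.0;
    for (int i = 0; i < 1000; i++)
        x = x * 1.0000001 + 0.0000001;
    f_sink = x;
}

static volatile unsigned u_sink;
static void burn_int(void)
{
    unsigned x = 1;
    for (int i = 0; i < 1000; i++)
        x = x * 2654435761u + (unsigned)i;
    u_sink = x;
}

static burn_fn routines[] = { burn_fpu, burn_int };

/* Placeholder: a real version would poll the CPU temperature sensor. */
static double read_cpu_temp(void) { return 0.0; }

/* Try each routine, note which one heats the CPU the most, and return
 * its index so the main loop can concentrate on it. */
size_t hottest_routine(void)
{
    size_t best = 0;
    double best_temp = -1.0;
    for (size_t i = 0; i < sizeof routines / sizeof routines[0]; i++) {
        routines[i]();
        double t = read_cpu_temp();
        if (t > best_temp) {
            best_temp = t;
            best = i;
        }
    }
    return best;
}
```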
  • Re:Yes. (Score:2, Interesting)

    by that this is not und ( 1026860 ) on Friday March 23, 2007 @12:11AM (#18454509)
    I learned programming at a 'professional' level by being thrown at a task and challenged to get it done: a stand-alone medical device using a Hitachi 4-bit embedded controller. You learn fast that your code starts at the reset vector and you have to set up and initialize ANY resources your design will be using. A few years into that job, the company brought on an older, experienced software engineer, but one who came from 'application'-level programming. He knew a lot about what he was doing, but he was horribly reckless about initializing timers, critical configuration registers, etc., because he came from a world where all that was done for you by the operating system.

    Right now I just wish Microchip would do a better job of documenting the resources in their little PIC chips. With each new model I decide to code for, it's tedious to dig deep into all the arcana of the datasheet to figure out which registers to poke something into to get the chip up and running what I need. They're certainly better, though, than some of the Japanese vendors, whose English manuals are bad translations of the Japanese originals.

    The first word on the first line of the table of contents on one manual I remember having to rely on was misspelled:

    "Beneral Introduction"

    It didn't inspire confidence in the rest of the manual.
  • Re:Yes. (Score:3, Interesting)

    by lordmatthias215 ( 919632 ) on Friday March 23, 2007 @12:52AM (#18454759)
    I agree that console programmers probably aren't pushing their hardware as hard as possible. Unlike PC programmers, they have the luxury of the same equipment in every unit, so they really don't have an excuse not to learn the system's assembly at least well enough to squeeze a few extra cycles out.

    Out of curiosity (I'm not trying to troll here), what about the PS3 makes programmers understand the hardware better than they need to understand the 360 or Wii hardware? I know it has the Cell processor, but IIRC they're just writing to Sony-provided APIs that thread the processes automatically.

    I'd actually expect programmers to have to know the Wii hardware much more intimately, not only to obtain more finesse with gesture-based controls, but also to squeeze the Wii for all it's worth and produce graphics that can hold people's attention once the 360 and PS3 start pumping out even more advanced graphics. In fact, I think that if programmers don't invest the time needed to learn the Wii completely (even though it's very similar to the GameCube in architecture), the system may not be able to hold its momentum for long. Of course, Nintendo may have had this in mind, and will introduce another generation before the normal 5-year cycle is over.
  • Re:All's quiet (Score:2, Interesting)

    by kyashan ( 919683 ) <dpasca@gmail.com> on Friday March 23, 2007 @02:56AM (#18455283) Homepage
    Assembly for shaders isn't written in stone. Graphics hardware is moving so fast that the concept of a fixed assembly language for it is limiting: limiting for the hardware makers, limiting for the software developers.
    One can always squeeze out a cycle somewhere, but it doesn't make sense. The problems with shaders are different ones, like integrating them with CPU code and balancing between dynamic branching and code permutations (combinations of features in a shader are selected either with an IF or with a bunch of includes, so to speak).

    Knowing assembly for shaders is definitely not worth it, in my opinion; I highly doubt anyone in the game industry uses it.
    LUTs, on the other hand, are very much used in Cg and HLSL for common techniques such as PCF shadow mapping and whatnot.
  • by Alioth ( 221270 ) <no@spam> on Friday March 23, 2007 @06:46AM (#18456291) Journal
    I'm building my own computer (no, not getting some random PCI cards and plugging them into a motherboard): I'm designing a simple Z80 system for fun.

    If you want to mess around with this sort of thing, you cannot avoid writing things in asm. I've got this far:

    http://www.alioth.net/Projects/Z80/Z80-Project/Z80-Project-Pages/Image19.html [alioth.net]

    - having laid out a double sided PCB, and got everything shoehorned onto a 160x100mm 'Eurocard' sized motherboard.

    However, I've also retargeted the z88dk (Z88 Development Kit, originally designed for the Cambridge Z88 portable computer) to my Z80 board because while it'll be best to have all the low level stuff done in assembly language, writing things that use floating point will just be ten times faster to write in C.

    But even if you never intend to hack hardware, it's still important to at least be familiar with assembly language, if only to know why unchecked buffers are bad. If you've ever written a program in asm and accidentally overwritten the stack, tromping all over your return address, you fundamentally understand why this is a bad thing. We've gotten into a whole world of hurt because many programmers didn't understand this.
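    The lesson about unchecked buffers, in miniature (a hypothetical helper): bound every copy, because the unbounded version is exactly how a long input tromps over the saved return address:

```c
#include <stdio.h>

/* snprintf never writes more than outsz bytes, NUL included; the
 * unbounded sprintf(out, "Hello, %s!", name) would happily run past
 * the end of `out` and on toward the return address. */
void safe_greet(char *out, size_t outsz, const char *name)
{
    snprintf(out, outsz, "Hello, %s!", name);
}
```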
  • Re:All's quiet (Score:2, Interesting)

    by Hal_Porter ( 817932 ) on Friday March 23, 2007 @06:49AM (#18456299)
    If you use Win32, you can use InterlockedIncrement [microsoft.com]( &counter );
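    For comparison, in standard C the same thing is an atomic fetch-and-add (C11's <stdatomic.h>, which postdates this thread):

```c
#include <stdatomic.h>

atomic_long counter;   /* file-scope atomics start at zero */

/* Atomically increment and return the new value, matching the
 * return convention of Win32's InterlockedIncrement. */
long bump(void)
{
    return atomic_fetch_add(&counter, 1) + 1;
}
```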
  • by burnttoy ( 754394 ) on Friday March 23, 2007 @07:16AM (#18456433) Homepage Journal
    OK, first of all, I'll blow my own trumpet: over the last 20 years or so I've programmed 6502, Z80, x86 (16/32/MMX/SSE/SSE2), ARM, and various proprietary SIMD, RISC, and pseudo-MIMD machines. TBH, the payoff for these skills simply isn't worth it.

    As an asm coder I _may_ find a full-time job, but asm will take as little as 10% of my time. Contract asm work is out of the question; I haven't seen any in years (not since I wrote a serial port driver for Win3.1). I actually like programming in assembler, but for the sake of my pay packet and career I have reskilled in PHP, MySQL, CSS, XHTML, JavaScript, etc., simply because I can find contract work that pays well, something that appears unachievable with asm. Maybe this is why we are a dying breed.

    Lastly, you're right: this discussion crops up frequently on BBSes, Usenet, etc. It seems the answer must be that asm coders are still needed and asm is still relevant. If they weren't, why would we be discussing its relevance?

    Incidentally, if anyone would like to hire an asm coder who likes asm, mail asm@burnttoys.net ;-)
