Why Not Ada?

David M. Press asks: "Why is it that Ada 95 is not in greater use in Open Source projects? It can compile to machine code or Java bytecode, it has built-in tasking (threading), it supports both the procedural and object-oriented paradigms, and it is very human-readable (based on Pascal syntax). And best of all, the GNAT project (based on GCC) provides a fully conformant Ada 95 compiler for multiple platforms. What more can you ask for?" Good point. Is Ada just a victim of obscurity, or are there real reasons not to use it?
  • by Anonymous Coward
    Ada was designed for different needs than most programmers have. I found it cumbersome compared to other languages. It would just be another way to spread developer skills too thin. I am sure there are projects where it would be great - code running a nuke plant for example. But, I already have to use Transact-SQL, PL/SQL, Java, ASP, JSP, Javascript, HTML, NT tools, unix shell commands and Perl just to make it through the week. Why make it worse?
  • As sschwarm said, there are two types of exceptions: those that signify programming errors, and those that signify external failures.

    In the first case, exceptions are useful because they are a much cleaner way to represent errors. Consider your square root example. If someone passes a negative value, the program has a bug. (Say we are coding in Java, which doesn't have unsigned types.) How can the coder who wrote the sqrt function know what the right thing to do is? Obviously, he can't. The solution that you proposed, saying that sqrt(-9) = 3, doesn't help the programmer find the bug. C is much worse, where functions' semantics are clouded by the type of fix you present. (Example: what does realloc(NULL, 23) mean?)

    In the second case, exceptions are useful because they let you separate the code that generates the error from the code that deals with the error. When your TCP connection dies, your TCPConnection class methods can throw a ConnectionLostException, and then the UI can catch that exception and deal with it appropriately. The alternatives are to have all function calls return their error status (which obfuscates the code), or to have a global errno variable (which doesn't work for threaded apps, and is a bad design anyway: e.g. Unix signal handlers may have to save and restore errno).

    Finally, your argument is that exceptions are unnecessary, so they shouldn't be used. Don't forget that all of the following items are unnecessary: classes, while loops, for loops, functions, pointers. Obviously there are other important criteria.
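    Since the thread is about Ada, here is a minimal Ada sketch of that separation between the code that detects a failure and the code that decides how to handle it. (The names Connection_Lost and Send_Message are made up for illustration.)

```ada
with Ada.Text_IO;

procedure Demo is

   Connection_Lost : exception;

   --  Detection site: report the failure; don't decide recovery policy here.
   procedure Send_Message (Alive : Boolean) is
   begin
      if not Alive then
         raise Connection_Lost;
      end if;
      Ada.Text_IO.Put_Line ("sent");
   end Send_Message;

begin
   Send_Message (Alive => False);
exception
   --  Handling site: the caller (e.g. the UI layer) decides what to do.
   when Connection_Lost =>
      Ada.Text_IO.Put_Line ("connection lost, please reconnect");
end Demo;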
  • Sorry, but if the program was written sensibly in the first place, there would BE no exceptions.

    What, are you crazy? What on earth should I do if, say, a database connection fails because a process on another machine has failed? Either raise an exception, or just make up the record set from /dev/random?

  • Oh dear, language wars. You have pretty much summed up the view I was trying to express (rather than the ones you have inaccurately inferred):

    > Almost nobody writes Ada for fun.

    My point: if you enjoy programming, you'll probably find Ada extremely tiresome. Of course you can write great C and bad Ada. But I suspect many who program for kicks would enjoy writing bad C more than good Ada (providing the rest of us don't have to use it, of course!).

    I don't dispute the value of an engineering approach to software, if only it was a true approach rather than the unpragmatic dogma favoured by academia or the lip service paid by industry. I certainly don't dispute it when flying (although I get nervous in those A320s).

    Ade_
    /
  • Yes, Ada is very productive if you're a university wishing to churn out carefully indoctrinated "software engineers"...

    When I did my degree, Ada was used as a teaching language for almost the entire three years (with some diversions: ML as an example of a functional language, M68k assembler to deter us from writing in assembler ;-). And yes, it did seem verbose, cumbersome and dull - which was also what individuals became if they programmed in it for too long. It didn't help that the lecturers never saw fit to teach us about the compiler environment - make tools and so forth - that might have made development less painful. I don't know about GNAT, but the Ada compiler that used Ada syntax in its command language sucked hard.

    Anyway, the only programming projects I really enjoyed were the C ones in the third year (C was covered almost in passing within a module seemingly provided as a concession to those in the department who claimed that graduates needed C skills for the job market). I think it's true that Ada partly suffered from a lack of bindings to other libraries, which made C projects look more interesting and versatile. (We used an Ada curses binding once...yuk.)

    Since then, my old CS dept has swapped to Java as a teaching language, which has the significant advantage of being more vocational. I doubt many will miss Ada.

    Ade_
    /
  • Nuke plants don't run on software, they're designed to be intrinsically safe and use ultra simple and reliable analogue protection systems.
  • unless you work for Rockwell


    I think that's his plan, funnel them straight down to Cedar Rapids.


    The joke when he came in and switched the freshmen to Ada was "How much does it cost to buy a computer science department?"


    Glad I got to take C...

  • As an alum of this illustrious college, I believe that the joke was:

    When John Deere came in and gave the department all new PCs (with the John Deere logo on them, of course), it was questioned, "How many computers does it take to buy a Computer Science Department?"

    In my opinion, Ada is about as worthless as Visual Basic or Latin. MUMPS was more elegant than Ada.
  • Ada is very popular with companies that write software for avionics, both civilian and military, and medical devices. I've heard that Boeing uses it in all their new aircraft.
  • That and it's named after a girl.


    This doesn't seem to have harmed Perl. :)

    (Yes, I do know it's spelled differently, for you spelling weenies out there).

  • I took Ada from McCormick before he left my old university. Is he still playing with his model trains, doing real-time programming for them with Ada? I feel your pain with the structures, though. Don't worry, after school you won't ever use the language again (unless you work for Rockwell or the military).
  • Interesting. Does anyone have more detail about this compiler? I remember hearing, all through college (through about '88), that there were no approved/validated/certified Ada compilers. What platform did the Ada 83 compiler run on?

    Adrian
  • I was dragged kicking and screaming into my first Ada project. I would have preferred almost anything else. Having programmed in C and then C++ for 12 years, I argued vehemently for C++, and lost.

    6 months into the project I suddenly found that the awkwardness of a strongly typed language had become an asset. I was writing code just as quickly as anything else I'd ever done, and it WORKED! I no longer had to deal with memory stomping problems or unexpected behavior because I accidentally used = instead of ==. The only problems I was dealing with were logic problems. So if the logic of my solution was correct and I correctly implemented that logic, I was done. On C and (especially) C++ projects, this phase is called integration 8-() and lasts forever. Java is much better but has its own drawbacks since it is still interpreted.

    The bottom line is that we produced rock solid software in somewhat less time than we would have with C++ or its ilk. I've since been on several C++ projects and they have only strengthened my opinion. One project involved an embedded system written in C++ which could only run 26 minutes before it crashed. It was fielded, since the typical runtime duration for this system was 20 minutes (only because they couldn't go longer without embarrassment). A similar project written in Ada (not by me) has been working beautifully for 5+ years, but the "Ada is bad" crowd managed to get the new system written in a much better language, with laughable results.

    It is unfortunate that the free software culture shuns good software engineering practices and languages that support them. Free software is good because 1) most of it is NOT written in C++; 2) some very talented people have made major contributions; 3) most importantly, it undergoes rigorous ad hoc testing by the community at large (one of the first things to be scrapped in the commercial world is good testing, aka MS).

    I don't have any solution for this, although I understand that Ada is being used to teach programming in universities more because it has been tied to government money. Maybe that will work.

  • Ada was developed by the government for military applications. Since its scope was so specialized -- in that the only places that wanted you to know it were military -- finding an Ada programmer is hard. So hard, in fact, that the government is abandoning it in favor of C/C++. So even the founders have given up on it. But I've wondered the same thing.
  • (display "hello world\n")
  • There's nothing wrong with a language being named after a girl...just that Ada is named after a rather messed up girl. Allow me to loosely paraphrase from a text I found in my /info where my prof kept all the stuff from the book.

    Important Disclaimer
    Warning, the following loose paraphrase reads like the "Killer the Dog" monologue in Half Baked. The author would like to publicly state that he is not baked..just feeling the effects of sublingually ingested Penguin Mints [thinkgeek.com]

    The Story of Ada, as told by some file I found with my GNAT compiler

    "Ada was Lord Byron's daughter. It is practically the law to mention this point in any discussion having a word with the letters 'ada' in it, whether or not the discussion is related to programming. She was his only legitimate daughter. Previously, he was married to his half-sister and had a kid or two. (I'd go re-read all this, but I don't want to log into my filesys on account of my laziness).

    So anyway, she was the only legitimate daughter of this guy Byron (who left his sister and married someone else, hence her legitimate status).

    She worked with Charles Babbage...she was the first programmer. (This line inserted in accordance with the Ada Detail Discussion Act). She worked a lot...eventually she got some kind of digestive/respiratory problem.

    They gave her drugs.

    Literally, lots and lots of drugs. (Begin the freaky part...I'm not making any of this up). Using the finest "modern scientifik resoning methods" of the day, they treated her with a mixture of brandy, morphine, heroin, and some other stuff. Needless to say, she became quite addicted to this stuff.

    Also, she went insane (echo: insaaane). She believed she could communicate and do math with God. Eventually, she figured out that the drugs were messing with her, and quit cold turkey. To dull the pain of her many and various kinds of simultaneous withdrawal, she took up gambling on horse racing. She died in debt.
    .steady now
    ..
    ...wait for it
    ....
    ...........oh yeah....and they named a language after her."

    RE: Perl...yeah...perl was named after that completely messed up kid in that hateful book: The Scarlet Letter

    To the great and wise moderators: yeah, this is mad out of date, but I'm being force-fed Ada95 and have been waiting for an opportunity to vent in a public forum.

  • Actually, I'm an Electrical and Computer Engineering major (hardware, not software), and further, I'm supposed to be in the Fortran class, but was misadvised and wound up in Ada with the CS majors. Being a hardware major, I only get a little programming. The irony of all this is, one of the things I'm considering doing with this hardware education of mine is going into military-type work where they use Ada. Go figure.
  • it is very human-readable (based on Pascal syntax).

    Personally, I think that Ada is not particularly human-readable, just wordy (like Pascal or (IMHO) Java). Well written C or C++ is (or at least should be) quite readable (to someone who knows C and C++, of course).

    Also, as others have pointed out, relatively few libraries exist for Ada (either written in Ada or bindings for C/C++ libs). Which means you have to either create a binding (hard without support from something like SWIG), write it yourself, or do without. None of which are particularly appealing to most people.

    Also, Ada is rarely (if ever) taught. C, C++, and Java are the languages of choice in education and industry, so those are the ones people are going to learn. If Ada took off and everyone was doing it, people would start learning it: few people are going to learn a language they can't really use for anything. But who is going to start some big important project and do it in a language nobody knows? So it's a bit of a catch 22.
  • The statement that "Ada is rarely(if ever) taught" is patently untrue. Scores of universities teach Ada as both a foundational language and in advanced courses.

    OK, I stand corrected. However, at the colleges/universities where I have taken CS classes (University of Oregon, Oregon State, a community college in Oregon, and Johns Hopkins), they did not have any Ada classes. At JHU, C, C++, Java, Perl, Lisp, and ML are used (C++ and Java are the only ones taught in early classes, the others you just have to pick up).

    I know more people who write stuff in Forth (1), Objective C (3 or 4) and PostScript (2), than Ada (0). In fact AFAIK I don't know anyone who even knows Ada.
  • I know more people who write stuff in Ada (dozens) than Forth(0), Objective C(0), Postscript(0), Eiffel(0), or Smalltalk(0) :-)

    Fair enough. :)
  • Ada.Text_IO.Put_Line(Item => "Hello, World!");

    actually, if you add "use Ada.Text_IO;" then you could go ahead and do:

    Put_Line("Hello, World!");

    when making a function call, you can either be very clear about what you're doing with the little => (which we started calling a 'chun', after our lab TA), or you can just pass in the arguments in order.
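    A minimal sketch showing both call styles side by side (the procedure name Greet is made up; Put_Line's formal parameter really is named Item):

```ada
with Ada.Text_IO; use Ada.Text_IO;

procedure Greet is
begin
   --  Positional association: arguments are matched by order.
   Put_Line ("Hello, World!");
   --  Named association: the formal parameter is spelled out with =>.
   Put_Line (Item => "Hello, World!");
end Greet;
```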

    ---

  • I like Ada.

    At my university, we teach Ada. It is a great language. It was written especially for embedded systems. To Ada, programming a VCR or a car is the same as programming a computer.

    Ada was not written to prevent dumb errors. Ada is strongly typed. This is NOT cumbersome. Using the data type that you mean to use is just GOOD SOFTWARE ENGINEERING. You can always typecast if you want...

    Ada is a powerful language. I would like to see it used more.
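    A small sketch of the strong typing this comment praises. The Meters and Feet types (and the conversion factor) are made up for illustration; the point is that the compiler rejects the silent mix-up, while an explicit conversion is always available.

```ada
procedure Units is
   --  Two distinct numeric types: both are floats underneath, but the
   --  compiler refuses to mix them without an explicit conversion.
   type Meters is new Float;
   type Feet   is new Float;

   Runway : constant Meters := 3_000.0;
   Report : Feet;
begin
   --  Report := Runway;              -- rejected at compile time: Feet /= Meters
   Report := Feet (Runway) * 3.2808;  -- explicit conversion states the intent
end Units;
```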
  • There are two principal reasons why Ada has failed as a language. (And I say "failed" because it has failed to achieve widespread acceptance; it's used, for the most part, only where it is REQUIRED to be used.)

    1. Lack of unified design vision. Most successful s/w projects come from a very small team -- often a single individual -- who imposes a single world view on the creation. Examples: Unix, C, perl, sendmail. Not that these are perfect -- that's not the point. The point *is* that they're hugely successful, and part of that is due to reasonable adherence to a relatively tight set of design goals.

    2. Creeping featurism. Ada is an incredibly bloated language: it has way too many features, and it requires way too much code to perform simple tasks; even its source code is verbose and unwieldy. This is probably a direct consequence of its origins, specifically the design-by-committee approach. ("If anyone wants this, we'll include it.") As a result, it's a huge language in every sense of "huge" -- which makes things more difficult, not easier.

    I recall having a similar discussion in the mid-80's with someone who insisted that Ada would displace C (and the then-new C++ and Objective C). Clearly, the exact opposite has happened -- and will happen again should anyone decide to repeat the Ada design process.
  • I hardly think it's a "misconception"; I think it's quite clear that Ada is the product of a *long* series of committees charged with defining the standard. And like anything subject to this sort of process, it reflects its origins. This isn't to say that there weren't some talented people involved, or that some of them didn't have cohesive visions for the language. They were. They did. But ANYTHING that gets pushed through a committee process this lengthy, and with as much vested interest (from the US government procurement angle) is bound to come out fairly warped. Could it have been different? Maybe. But it wasn't, and as a result, Ada is well on its way to being a relic -- an interesting relic, but not one of ongoing interest to those on the leading edge.
  • At my college we learn Ada first because we have one of the inventors of Ada on our CS faculty. I must say, though, that it has some definite benefits as a teaching language. The compiler finds a lot of the stuff that comes back to bite you in the ass later (during execution) in C. The reason it is a pain in the ass is that it will also give you errors for having the incorrect number of spaces of indentation in a block. Pointers are also much more of an ordeal in Ada than in C. It also takes care of a lot of the passing-by-reference vs. passing-by-value stuff that is such a pain in C. All in all, I have to say I actually prefer Ada because it makes post-compilation debugging so much easier than C.
  • >This leads to ALL UPPER CASE PROGRAMMING

    Bzzzzt. Sorry. Not that I've ever seen. There may be shops that do that (God knows why), but the language reference manual and "official" style guides are filled with words and examples about capitalization as an aid to readability.

    Ada is case-insensitive, perhaps due to this requirement. There are arguments on both sides of that issue, and I can't find myself getting passionate about either side -- this being /., no doubt others will supply this momentary lapse.

    There was only one Ada horror from its requirement to be usable internationally and on older machines: "not equal" is written "/=" because some terminal on some machine didn't have a "!".

    >Objects came late to Ada.

    I really believe that Ada 95 failed to capture a large mindshare for no other reason than because its creators refused to use the simple word "class" and used a weird syntax of "tagged types" and so on to describe what was:

    • a difficult concept to begin with, and
    • a concept that other people were already describing with other words.
    No doubt some CS PhD had good reasons why "tagged type" and the rest were "better" words. I'm reminded of the couplet to the pigheaded driver:
    He was right, dead right, as he sped along

    But he's just as dead as if he were wrong
  • >Ada was designed, in part, to help keep the programmer from making some kinds of mistakes.

    Absolutely true. They realized, to take one small example, that even though programmers should do array bounds checking, in the real world they don't. So they built that into the compiler. I've been called out in the middle of the night on a project in FORTRAN to fix what turned out to be somebody else's code running off the end of an array into where my code was running, so I'm grateful to the Ada designers for that.

    >My feeling is that this philosophical influence on a language tends to lead to a more cumbersome language.

    I don't know what "cumbersome" means. If it means that it's more difficult to get something out of the compiler than it is in C, I agree. But then my practice in C and C++ is often to compile and link and immediately fire up the debugger without even bothering to see if it runs, because I know it won't. By contrast, if code got out of an Ada compiler, you knew it would probably run and maybe even do the right thing when it ran.

    >Indeed, Ada is a complex enough language that it took years to create a certified version of the compiler.

    The word is "validated", and I don't recall it taking years.

    >NOBODY wanted to use Ada.

    Young military officers figured out pretty quickly that:

    • they wouldn't be in the service forever, and
    • there were more C++ jobs than Ada jobs in the civilian world.
    As a result, they raised hell about the Ada requirement, and program offices were issuing waivers for the equivalent of a case of the vapours.
  • >Yes, Ada is very productive if you're a university wishing to churn out carefully indoctrinated "software engineers"...

    I was kind of hoping you'd follow up on that. I infer from the way that's said that you probably believe that an engineering approach to software isn't of much value to you.

    You're certainly entitled to that opinion, if it is your opinion. But I hope I'm similarly entitled to being proud of what I believe is a more fully scoped professionalism.

    I do take issue with Ada being mostly academic. My experience with it has been very much in the real world. And I think in academia, the flavor of the month is either C++ or Java.

    I just escaped a C++ project at work that was being run by free spirited OO bigots from academia who had never delivered more than a few hundred lines of code on time and on budget. Needless to say, it was a disaster. I'll take professionals over amateurs every time.

    This is absolutely not an Ada vs. C++ issue. It's possible to have a bad design and irresponsible practices in any language.

    >And yes, it did seem verbose, cumbersome and dull - which was also what individuals became if they programmed in it for too long.

    Sorry for such an obvious response, but I'd much rather have the person flying the plane I'm on be a little verbose, cumbersome, and dull.

    Ada doesn't have hack nature. Almost nobody writes Ada for fun. People who write Ada usually do it for a living and usually do it in a large project. I'd almost be willing to assert that for quick hacks using Ada is a stupid idea. There are times when you need a fondue fork and times when you use an air hammer.

  • Man, I don't know where to begin on this one. Virtually all of the technical information in it is false. Let's take them in order:
    1. Ada doesn't permit dialects and subsets. For a long while you couldn't even use the name "Ada" on a compiler unless it implemented the full language. AFAIK there has never been an Ada compiler that didn't implement the full language.
    2. Ada compilers nowadays aren't any slower than C++ compilers, since they use the same back end. Even in the early days, only a handful of Ada compilers were really slow.
    3. Perfectly true, and the DOD didn't promote it all that much either. I personally disagree that the DOD "endorsement" was that much of a negative.
    4. The machine code produced by most Ada 83 compilers is so good that there have been frequent studies showing it to be faster than the output of C compilers for the same platform. With gnat, as you might expect, there isn't much difference since it uses the gcc back end.
    5. It's called gnat or perhaps GNAT. It's quite true that it was a stupid idea to make US$20,000 Ada compilers at the very time that Sun and SGI were giving away C compilers as standard equipment in their OSs. COBOL is so far from dead there have been frequent jokes about resurrecting COBOL programmers for the Y10K problem.
    6. "Interesting" is unmeasurable. We differ.
    7. I think you misunderstand exceptions. By keeping the exception control flow path separate from the normal control flow path, you always make your program clearer and generally make it faster as well. For a somewhat unfair example, see gzip.c
  • One thing that's come up in several threads is that Ada is a software engineering language, not a hackish language. I think that's absolutely correct.

    I first hired on with my present employer when a large project using FORTRAN was in its integration phase. At the time I hired on, the project had gone six months past the delivery date, and we finally delivered some eighteen months after that.

    Being fairly perceptive, I got off that project as quickly as I could and onto a project in Ada that used a software engineering approach, with strict rules about control flow and data flow, encapsulation, and intercommunication between elements. Not that I knew much of that, or even what that was all about, at the time. I was just glad to get off that first project.

    The Ada project not only delivered on time, but actually under-ran the hours allocated for integration, even when we had to separate the code into two separate processes and put it on a multi-processor system.

    Since then I've worked on projects in Ada, FORTRAN, C and C++, and found that while software engineering discipline could be used on a project in any language, in general Ada seemed to support it better. Whether this was due to the language itself or whether project managers who believe in and understand software engineering happen to pick languages like Ada, I don't know.

    I do know one thing: every Ada project I've been on has taken a smaller time for integration than any project I've been on in other languages. This includes one project where four different contractors developed software in three different cities, and when it all went together the first time, it ran.

    I've also noticed that in other languages, particularly C++ in the wrong hands, changes made to one part of the system have a far greater tendency to break other people's code than they do in Ada.

    Finally, I've noticed that some of the time savings from integration is spent at coding time cursing the compiler and the language for not letting you get away with stuff. But not all the time savings. And it's a lot easier to fix compile time errors than runtime errors.

    I don't write Ada for fun. I don't think many people do. And since Ada work has dried up, I don't write it much for a living any more either. But if I had a big project with a tight schedule that we needed to make money on, I'd certainly push to use Ada on it.

  • Strange. Whitespace and indentation don't matter in Ada any more than they do in C.

    I'll bet the reason your system was set up to ding you on indentation was to make life easier for the TA who had to grade your projects.

    On making post-compilation debugging easier, much of the time Ada makes it nonexistent! It's reasonable to expect that if an Ada program compiles, it runs, and runs correctly. CS students and other beginners, who show much more creativity in misunderstanding a problem than folks with more experience, may be an exception to this rule, but not by much.

    Back in the day, many platforms didn't even have Ada debuggers, and while it'd be hyperbole to say we didn't miss them, we got to the end of the project without too many suicides or homicides.

  • >its blocks are delimited by indentation.

    Not exactly. Its blocks are delimited by words, not symbols. Otherwise, it's every bit as free-form as C, etc:

    with Ada.Text_IO; procedure Hello is begin Ada.Text_IO.Put_Line("Hello, world!"); end;

  • I have used Ada for about 10 years. From the comments I've read and heard over the years, I'll make one observation: a person's dislike of Ada (not ADA -- it's a name, not an acronym) is in direct relation to the amount of incorrect information the person has about Ada.

    IMHO, Ada is superior to C/C++/Java when you care _how_ the final product works.

  • I'm a CS student at CU Boulder. I'm lucky enough to get to choose my language for almost any of the projects I work on. Of course, the instructors try to give us hints on what to use. Without anyone asking about Ada, almost every instructor chooses to complain about it at some point. What am I as a student to do? Should I learn it in favor of tons of other languages that are recommended, popular, used in the real world, and that employers want me to know?
  • What about disk full, or you mistyped the filename so it is not present? These are exceptional conditions. It is important to note that not all exceptions are errors in the program; some are just unusual occurrences.
  • 1) It is not that it is complex, it is more that it is precise. There is an extensive validation suite, and much effort is spent in the standard to cover the details.

    2) GNAT compiles as fast as gcc. The slowness complaint was once true, but no longer.

    3) There are groups that try. SIGAda of the ACM is a good example.

    4) GNAT uses much of the optimization machinery that gcc uses. What I have found is that it generates better code because you can express the problem better.

    5) I agree. This was due, in my opinion, to the early mandate by the DoD. There were some not-so-expensive compilers around, like R&R. They cost in the same ballpark as a pro version of a C++ compiler.

    6) How about the control system for the Boeing 777? In my opinion, some companies are keeping quiet that they are using languages like Ada because of the advantage they have.

    7) (lastly) I have TOO many bad memories of trying to debug code written in C where the status returns are not checked. Or of trying to do it correctly and doubling the code. The best example I can think of is the use of some of the logical operations in the POSIX standard, where one line of Ada and a 3-line exception handler had to be replaced with 4 lines of C for each term in the expression. Or how about concatenating strings and overflowing the array...
  • Oh, by the way, one more thing...

    Name one C++ compiler that is certified ANSI/ISO C++ compliant.

    (Hint: Take a look at http://www.eds-conform.com/ValProdList.html. EDS is the company chartered by the US National Institute of Standards and Technology (NIST) for performing programming language conformance assessments.)
  • It did not "[take] years to create a certified version of the compiler."

    ANSI/MIL-STD 1815A published -- FEB 83
    First 1815A validation -- APR 83

    (source: http://www.adaic.org/pol-hist/history/holwg-93/8.htm#milestones)
  • I hope this was a sarcastic comment :-)

    Ada operates in at least following nuclear facilities:
    Doel nuclear power plant, Belgium
    Hinkley Nuclear Power Station, Somerset, England
    US Dept. of Energy National Ignition Facility
    Westinghouse Czech Nuclear Shutdown System

    (source: http://www.seas.gwu.edu/~mfeldman/ada-project-summary.html)
  • The statement that "Ada is rarely (if ever) taught" is patently untrue. Scores of universities teach Ada as both a foundational language and in advanced courses. See http://www.seas.gwu.edu/~mfeldman/ada-foundation.html.
  • Well, I do enjoy programming...in Ada...and I certainly do NOT find it tiresome! :-)

    The idea that someone would enjoy writing "bad C more than good Ada" just boggles my mind. I want my software to work, and work well, with me not having to futz around a lot to get it to that point.

    Ada catches a lot of programming errors at compile time, which eliminates a whole class of errors. (BTW, the assertion is sometimes made that if a program written in Ada compiles, it'll probably work. Well, that's bogus; correct compilation of a program written in a strongly typed language simply increases the likelihood of it working more quickly, because of the up-front reduction of type and interface mismatch errors.)

    Once all the code compiles and links, the run-time system instantly catches additional bugs, such as out-of-range values, null pointers, array overruns, etc., similarly to the way the JVM monitors a Java program's execution. The Ada run-time simply provides more extensive checks (which of course can be disabled as needed).
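    A minimal sketch of one such run-time check, an out-of-range value caught by a range-constrained subtype (the subtype Percent and the values are made up for illustration):

```ada
with Ada.Text_IO;

procedure Checks is
   subtype Percent is Integer range 0 .. 100;
   N : Integer := 150;
   P : Percent := 0;
begin
   P := N;  --  range check fails at run time: 150 is not in 0 .. 100
   Ada.Text_IO.Put_Line (Integer'Image (P));
exception
   when Constraint_Error =>
      Ada.Text_IO.Put_Line ("out-of-range value caught by the run-time check");
end Checks;
```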

    The end result is that I can churn out working code very fast, code that takes full advantage of Ada's built-in object oriented constructs, generic templates, concurrency, distributed execution, and that works reliably. This way I avoid having to go back and fix badly written code again and again as I add more features and capabilities, instead I get to keep breaking new ground, which I find...enjoyable!

    Marc
  • The Ada programming languages has no requirement regarding the number of spaces of indentation in a block. The compiler you are likely using, GNAT, has options that enable various style checks such as the one you cite, and the use of which appears to have been mandated by your instructor.

    Yes, pointers are more of an ordeal in Ada because pointers are a very powerful feature and a major source of bugs. Ada attempts to flush out such bugs earlier by forcing developers to understand what they expect to do with the pointer, specify it accordingly, and then enforcing those expectations.
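    For instance, the intended use of an access type (Ada's pointer) is spelled out in its declaration. A sketch, with made-up names:

```ada
procedure Access_Demo is
   type Int_Access is access Integer;           --  may allocate and modify
   type Int_View   is access constant Integer;  --  read-only view

   X : aliased constant Integer := 7;
   P : Int_Access := new Integer'(42);
   V : Int_View   := X'Access;
begin
   P.all := P.all + 1;  --  dereference is explicit; no pointer arithmetic
   --  V.all := 9;      --  illegal: a view through Int_View cannot modify X
end Access_Demo;
```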
  • There is a popular misconception that Ada was "designed by committee". This is totally false. Just as C had K&R, C++ had Stroustrup, Eiffel had Meyer, Ada 83 had Jean Ichbiah, and Ada 95 had Tucker Taft.

    These language architects had final say over the architecture of their languages, and worked to ensure a consistent balance of features, clarity, and useability.

    It would be interesting to see a list of the "too many features" that have made Ada a "bloated language".

    Packages (comparable to namespaces)?
    Generics (templates)?
    Tasking (threads)?
    Object oriented constructs?
    What "simple task" does it require "way too much code to perform"?

    Let's compare strings:

    C:

        if (strcmp(str1, str2) == 0) {
            /* Do something */
        }

    Ada:

        if str1 = str2 then
            -- Do something
        end if;
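    And none of the listed features forces verbosity either; a complete generic (template), for instance, fits in a dozen lines. A sketch, with the specification and body shown together:

```ada
--  Specification: Swap works for any non-limited type.
generic
   type Element is private;
procedure Swap (A, B : in out Element);

--  Body.
procedure Swap (A, B : in out Element) is
   T : constant Element := A;
begin
   A := B;
   B := T;
end Swap;

--  Instantiation:
--  procedure Swap_Integers is new Swap (Integer);
```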
  • Actually, Ada has had a stable definition of packages and generics (namespaces and templates) since 1983. Ada 95 extended their capabilities a bit, but the original functionality was not altered.

    Tasking, by the way, which is Ada's fully integrated concurrency construct, has also existed in full-form since 1983. A lightweight additional construct, "protected types", was added in Ada 95.
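    A protected type sketch (this fragment would live inside a package or other declarative region); mutual exclusion is supplied by the language, not by explicit locks:

```ada
protected type Counter is
   procedure Increment;            --  exclusive access
   function Value return Natural;  --  concurrent readers allowed
private
   Count : Natural := 0;
end Counter;

protected body Counter is
   procedure Increment is
   begin
      Count := Count + 1;
   end Increment;

   function Value return Natural is
   begin
      return Count;
   end Value;
end Counter;
```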
  • I know more people who write stuff in Forth(1), Objective C(3 or 4) and PostScript(2), than Ada (0).

    I know more people who write stuff in Ada (dozens) than Forth(0), Objective C(0), Postscript(0), Eiffel(0), or Smalltalk(0) :-)
  • Ok, name one commercial ADA compiler that implements the FULL specification. And by full, I don't mean "just the essentials". I -mean- the full specification. Every type, every run-time binding, every instruction, every form of inheritance, -everything-.

    Okay. Ada Core Technologies' GNAT. Implements the core language and all Annexes.

    It's Open Source, get yerself a ready-to-install version for Linux from http://www.gnuada.org/alt.html and check it out!
  • The first Ada validation certificate was earned by "Ada/Ed", which in truth was an Ada interpreter...and verrrrry sloooooowwwwww.

    The first actual compiler validations were for a Rolm/Data General box and a Western Digital system.

    In 1984, TeleSoft released their compiler, which was a piece of crap, and DEC released theirs I believe in 1985. The DEC compiler was the first production quality compiler I know of, and the beta of that may have been available in late 1984. I worked on a team that worked with that beta version (due to non-disclosure, we had to call it "compiler X") and even in beta it blew away the TeleSoft compiler.

    By 1986-7 there were several Ada compilers around for various platforms suitable for doing serious work.
  • There are too many validated compilers to list here; check http://www.adaic.com/compilers/ for the full list. That site also lists tools, training, and other resources. Another good site to check out is http://www.AdaPower.com
  • I work at Johns Hopkins Univ and there IS an Ada class in the catalog. At the current time there aren't any classes scheduled, because the people who need to use Ada already know it, or teach themselves "on the job." Check http://www.jhuapl.edu/sigada/index.html for information about the active Ada user's group that meets at JHU/APL.
  • revbob wrote: "Almost nobody writes Ada for fun." At least you didn't say "nobody writes Ada for fun." :-) Having learned programming by looking at source code for Basic games at U of MD, I'm a firm believer that learning should be fun. For this reason I created a "Fun with Ada" lab at http://www.adapower.com/lab/adafun.html where we're collecting text and graphical adventure games, and controlling model trains, cars, and robots.
  • Better Ada sites are

    http://www.AdaPower.com

    http://www.AdaIC.org

    http://www.acm.org/sigada

    Ada job listings can be found at the AdaIC site as well as http://www.adajobs.com.

    There seem to be a lot of misconceptions about Ada in this thread. Ada is a great language for both high and low level programming. The JGNAT Ada to JVM compiler is open source.

  • I actually chose to use Ada95 on a project. I found out about it because I bad-mouthed it in ignorance, then I thought "I really don't know anything about it, maybe I should research it a little before I deride it". I was pleasantly surprised, and I did a moderate-sized project in it that turned out well. I wouldn't say the same thing about Ada83. It lacks basic mechanisms to do polymorphism, and its task constructs are too coarse-grained. But it just took a few small changes to end up with Ada95, which is the best general purpose language I have ever used, IMHO. What am I using now? Well, Java. We are using it because it's a lot less error prone than C and C++ and doesn't have the stigma that Ada has.
  • 1) Ada was not designed by committee. The design team, led by a single person, did benefit from the reviews submitted by software practitioners worldwide.

    2) It is not as huge as C++ or COBOL. In fact, it is designed around a few very good principles. These principles lead to a set of very rigorous rules for compiler development organized into thirteen fairly small chapters.

    3) We can write very small programs or very large programs in Ada. Usually, it is used for large-scale projects where a team of developers must communicate effectively with each other rather than for one- or two-person projects.

    4) Anyone who thinks the source code is, of necessity, verbose simply has not read enough code.

    5) The design goals of Ada are fairly straightforward. One of the most important is: the language design should allow a compiler to catch the maximum number of errors as early in the development process as possible. There are not very many languages with that design goal.

    Richard Riehle
  • "You speak formally when you are trying to prove something, or be academic, but to get work done, you don't wish to be bothered with the formalities."

    Much of the formality within Ada is geared towards:

    1) Non-ambiguous semantics resulting in less chance of programmer induced errors.

    2) "Programming in the large" - essentially programming large systems using large teams where interfaces are defined by one sub-team and used by another.

    3) Ease of maintenance.

    I have to assume from your comment that the "work" you do is not as part of a team, that your debugging effort is vastly more than the design effort, and that your code is never maintained, except possibly by yourself.
  • "According to my prof. ADA is used in military tasks involving 100% reliability (submarine radar tracking systems, air traffic control systems, stuff like that). "

    Are air traffic control systems a purely military thing? Ada is used most often when high reliability is required, not because it is a "complex" language, but because it is a safe language.

    As people have mentioned, it is heavily used in Air Traffic Control, Aircraft, Railway Systems and so on, both military and commercial. Pretty much anywhere where peoples' lives are at stake in the event of failure.
    For a more complete list check out:
    http://www.seas.gwu.edu/~mfeldman/ada-project-summary.html [gwu.edu].
  • "I don't know about GNAT, but the Ada compiler that used Ada syntax in its command language sucked hard. "

    AdaWorld?

    "Anyway, the only programming projects I really enjoyed were the C ones in the third year"

    Why? Was it because you enjoyed the challenge of discovering bugs and fixing them in C more than just writing the code in Ada and having it work first time?
  • If you haven't even seen Ada, here is a link to the GNAT compiler sources. Just download it, unzip and untar it and look at the code. Then let us hear what you think of it.

    ftp://cs.nyu.edu/pub/gnat/gnat-3.12p-src.tar.gz [nyu.edu]
  • "Exception handling is Bad Programming"

    I guess that must be why they didn't implement exception handling in:

    1) C++
    2) Java
    3) Eiffel
    :-}
  • "One feature that I would like to see in new compilers is better run-time diagnostics"

    GNAT is quite good at this. Also if you use exception handlers, Ada.Exceptions.Exception_Message on GNAT will provide you with exactly where the exception was raised, and Ada.Exceptions.Exception_Name the name of the exception.
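    A sketch of that usage (the exact message text is GNAT-specific):

```ada
with Ada.Text_IO;
with Ada.Exceptions;

procedure Diagnostics_Demo is
   Table : array (1 .. 3) of Integer := (others => 0);
   I     : Integer := 5;
begin
   Table (I) := 1;  --  index check fails: Constraint_Error
exception
   when E : others =>
      --  e.g. "CONSTRAINT_ERROR", and a message naming the failing
      --  source location and check
      Ada.Text_IO.Put_Line (Ada.Exceptions.Exception_Name (E));
      Ada.Text_IO.Put_Line (Ada.Exceptions.Exception_Message (E));
end Diagnostics_Demo;
```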
  • "Ada was designed to work with a 48-character set."

    I have no idea where you got this idea from. Ada was designed to work with the ASCII character set originally, now (Ada 95) the default is ANSI Latin-1.

    "So it doesn't use many special characters, or even lower case letters."

    False.

    "This leads to ALL UPPER CASE PROGRAMMING, which is today considered ugly"

    Most Ada code I've seen follows the conventions first used in the reference manual. For Ada 83, keywords were in lower case with identifiers in upper case, so you could end up with e.g:

    if THE_LENGTH_OF_A_PIECE_OF_STRING = 12 then
    ...
    Personally I found this convention to be appalling and ugly and didn't use it anyway, but it was only a convention - it was not dictated by the language which is, and always has been, case-insensitive!
    In Ada 95 the convention is much better. Keywords are still lower case, but identifiers are capitalised e.g.:

    if The_Length_Of_A_Piece_Of_String = 12 then
    ...
    One of the good things about Ada is that all the important documentation, e.g the ISO Standard, the Annotated ISO Standard and the Quality & Style guides are all freely available. So you can get a compiler (GNAT) and a Language Reference Manual just to try it out without spending any money whatsoever! Go on - try it!
  • "The reason people don't use Ada is because its blocks are delmited by indentation"

    That's an interesting one - I've been programming in Ada for nearly 10 years and never knew that before! I'd be interested in knowing whether you can provide a reference to where this is defined.

    Funnily enough, before I started using Ada I'd been using C and OCCAM for a couple of years, and OCCAM did use indentation as part of the syntax.
  • "but I'm being force-fed Ada95 and have been waiting for an opportunity to vent in a public forum"

    "Force-fed" huh? I guess you would rather be learning how to make megabucks writing flaky code in a flaky language (like C++ or Java) than being provided with a solid grounding in Software Engineering.

    It sounds like you have the wrong attitude. All you need to remember is that any good programmer can write in any language, but very few languages force you to think about what you're trying to do - especially C/C++. They just let you do what you want and then expect you to sort out the mess afterwards - with Ada, you're expected to not get into that mess in the first place.
  • "Actually, I'm an Electrical and Computer Engineering major (hardware, not software),"

    Interesting. You may find at some point that you will be designing ASICs and FPGAs. If you do, there is a chance that you will be doing this with VHDL. (There's quite a lot of money in this kind of thing in the UK). Should you ever be in this position, you will find that knowing Ada is a huge benefit as VHDL is heavily based on Ada.

    Although I hope this makes you look at Ada in a new light, I won't be holding my breath, but I would suggest you investigate VHDL before writing off Ada.


  • Not even slightly true. Ada was developed effectively as a panacea to reduce the massive costs involved in maintaining and supporting all the different languages and specialised compilers in use in the DoD at the time. So basically the intention was to make a language that would satisfy pretty much everyone. Specialised it certainly was not.
  • A disk being full is equal to being unable to write to the disk. The -program- shouldn't care why you can't write; it merely needs to know that you didn't. The "why" is only relevant to the user. Same with reads. Why should the program have to care why it couldn't open the file? If it can't, it can't. Its behaviour (eg: not trying to read) should be exactly the same. Again, that's user stuff. And that does not need complex exception handling.
  • Ok, name one commercial ADA compiler that implements the FULL specification. And by full, I don't mean "just the essentials". I -mean- the full specification. Every type, every run-time binding, every instruction, every form of inheritance, -everything-.

    Cobol is a dead language. NOBODY wants to handle BCD numerics. They're slow, inefficient and (worst of all) unsafe. Too easy to corrupt.

    Exception handling is Bad Programming. One way in, One way out. Nothing more, nothing less. If it ain't drawable as a JSD, BNF or a flow-chart, it's not a real program.

  • Let's take that last one. IMHO, no operation should ever "fail". If it fails, it means that you coded for a narrower range of conditions than the program encounters. If, on the other hand, you had looked at what inputs the program -could- receive, in the extreme, and coded for -that- range, the functions would always succeed.

    (The only exceptions are functions which allocate memory, as that involves circumstances outside of your control as a programmer.)

    Let's take that example of concatenating a string, as an example. Easy! Just get the length of each string, malloc a buffer equal to the sum of the lengths, plus one, and put the strings in that. You are -guaranteed- that the buffer will never overflow, because you've coded the solution in such a way as to make that impossible.

    (Fixed-length buffers are the ones that kill. Fixing anything before the fact is a good way to suffer bugs, fatal and - worse - non-fatal.)

    Another classic one is the square root function. Two possibilities. First, take as a parameter an unsigned value. If it's unsigned, it can never be negative, so your function will -always- be valid. Alternatively, take a signed value and return a signed float. If the value >= 0, then it's a real solution; if it's negative, the sign of the result can flag it as imaginary. The trick is to always define the pre-conditions and the post-conditions. If you typecast in such a way as to guarantee the pre-conditions, then you guarantee the post-conditions will also be met. Alternatively, expand both pre-conditions and post-conditions until, again, the limits of the types guarantee all conditions are met.
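    Ada can express this "typecast to guarantee the pre-conditions" idea directly as a constrained subtype. A sketch (Safe_Math is a made-up name):

```ada
with Ada.Numerics.Elementary_Functions;

package Safe_Math is
   subtype Non_Negative is Float range 0.0 .. Float'Last;

   --  The precondition "X >= 0" lives in the parameter's subtype:
   --  a negative argument raises Constraint_Error at the call site,
   --  before the body ever runs.
   function Square_Root (X : Non_Negative) return Non_Negative;
end Safe_Math;

package body Safe_Math is
   function Square_Root (X : Non_Negative) return Non_Negative is
   begin
      return Ada.Numerics.Elementary_Functions.Sqrt (X);
   end Square_Root;
end Safe_Math;
```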

  • Neither. You have a null record. As you return the record as a pointer, the valid response is to return a null pointer.

    Raising an exception means you have TWO paths out of a routine, which is Bad Ju-Ju in programming. Making up data is the same as not initialising your variables - another form of Bad Ju-Ju.

    Write Clean Code, because Clean Code is guaranteed to work. Spaghetti Code, with exceptions jumping around like a pogo stick on speed, is GUARANTEED to fail. Why? Because you can't possibly think of every possible exception. But, if your code is inherently designed to handle every possible case BY DESIGN, you don't need to care what those cases are.

  • There was a short-lived mandate at NASA to use Ada for new software. I thought it was a good idea. It died a quick death when people asked for funding for training and compilers. This was before GNAT, when compilers were very expensive.

    I still think there is a need for a language with strong type checking, range checks, platform independence and good bindings to the OS and GUI.

    One feature that I would like to see in new compilers is better run-time diagnostics. I used to use a DEC FORTRAN compiler that produced programs that would print out useful error messages, such as "divide by zero error at line 330 in module FOO", instead of just crashing.

  • OK, this is a site with a great deal of Ada advocacy:

    http://www.adahome.com/

    I've not used Ada (it's on my list of things to try), but it looks like it's got a lot going for it.

    I personally like protecting developers from themselves to some extent, because it results in more robust software (depending on the choice of language, no mysterious overflows or pointer problems). This is why at my job we use Java rather than C++ ... it produces much slower code, but is much faster to develop.

    Roy Ward.
  • ... And the first few years of my CS schooling were nearly unbearable. It was very similar to being forced to speak in very proper English all the time, calling every person you meet by their complete name, and enunciating each and every syllable.
    Looking back, I believe that ADA is a good language to learn as a first language. Just like being forced to take grammar and phonics in grade school. The teachers force you to speak properly during class, but you find more concise and effective communication possible when you talk with your friends using slang, vernacular, etc. You speak formally when you are trying to prove something, or be academic, but to get work done, you don't wish to be bothered with the formalities. I think that's ADA's biggest problem.
    Either that, or the fact that the ADA standard wasn't changed until 1995, well after other languages gained momentum and widespread acceptance in the academic community.

  • "Ada blocks are delmited(sic) by indentation"

    Just in case someone missed the significance of the Python reference (which does use indentation to indicate blocking), Ada uses a variant of begin/end delimiting that should be familiar to most C/C++ programmers.

    Really, what's the difference between

        void foo (void)
        {
            if (a < b) {
                c();
            }
            else {
                d();
            } /* end if */
        } /* end 'foo' */

    and

        procedure foo is
        begin
            if a < b then
                c;
            else
                d;
            end if;
        end foo;

    I've worked in C for 15 years, vs. 3 years in Ada, so I find the uncommented C cleaner for simple procedures. For complex procedures, or if the shop requires /* end loop */ type comments on all closing braces I find Ada cleaner. In any case, Ada definitely requires explicit delimiters.
  • My understanding is that Ada was designed, in part, to help keep the programmer from making some kinds of mistakes. My feeling is that this philosophical influence on a language tends to lead to a more cumbersome language. Indeed, Ada is a complex enough language that it took years to create a certified version of the compiler.

    I worked as a civilian consultant at an Air Force base for several years. I remember that there was fierce resistance to using Ada for many years after the US Department of Defense issued its mandate that Ada was to be used for everything unless there was a clear justification. Even though there was clear recognition that the multitude of languages used on individual systems and weapons platforms was a real problem, NOBODY wanted to use Ada.

    Adrian
  • Ada is taught at my school for both Comp Sci I and II. The reason given for teaching ADA is that it's a "strongly typed" language and it's pretty fair with error recovery. According to my prof. ADA is used in military tasks involving 100% reliability (submarine radar tracking systems, air traffic control systems, stuff like that).

    There are almost no professional opportunities for ADA programmers, unless you would like to work for Rockwell (no offence to those Rockwell folks, but you are the only people I know hiring ADA programmers). Ada is SO strongly typed it's almost code-prohibitive; I get bogged down in the structures and not the implementation of the solution.

    However, here are some links for ya'll
    ADA Reference Manual [adahome.com]
    a prof's home page [uni.edu]
  • Ada is a good language with several cosmetic problems.
    • For compatibility with old 1970s hardware, Ada was designed to work with a 48-character set. So it doesn't use many special characters, or even lower case letters. This leads to ALL UPPER CASE PROGRAMMING, which is today considered ugly. It also leads to verbose code.
    • Objects came late to Ada, as they did to C, but C++ got there first.
    • Ada was considered a big, complex language for its time. ISO C++ is bigger and more complicated, but programmers were sucked in by C, so they didn't have to face the mess. But go over to USENET comp.lang.c++.moderated and look at the terrible problems people are having with C++'s hokey "template" mechanism for generics.
  • First, ADA is =so= complex, that until recently there WERE no "complete" ADA compilers. This meant that you had lots of ADA dialects, each a subset of the "real" ADA language, and that a program that worked on one dialect might not even compile on another.

    Secondly, ADA compilation is slooow. This is inevitable, when you have a language that supports typing after the fact, amongst other obscure but powerful features.

    Thirdly, nobody has ever really promoted ADA, other than the DOD. And what geek in their right mind would trust a super-paranoid organisation, when it's telling you that something is for your own good?

    Fourthly, nobody but nobody has written a decent optimising ADA compiler. Nobody knows how! There has been almost zero research into 3.5th generation languages and how to tightly optimise the generated code.

    Fifthly, until GNAT came along, many ADA compilers cost as much as the machine they were to run on. Microsoftian in the extreme! Anyone with any sense would have said no to the ADA tax. (The COBOL tax is even worse! And COBOL is an all-but-dead language!)

    Sixthly, nothing interesting has been written in ADA, so geeks, nerds and coders have no incentive to grok it. (If someone were to port the Linux kernel to ADA, I'm sure interest in it would sky-rocket.)

    Lastly, too many bad memories of lecturers who keep their brain cells in their tuna-fish sandwich, telling people that the best way to debug a program is to throw in more exceptions. Sorry, but if the program was written sensibly in the first place, there would BE no exceptions. Exception-handling is a way to write sloppily, with no understanding of what you're actually doing.

  • by FigWig ( 10981 ) on Monday May 08, 2000 @02:26PM (#1086116) Homepage
    The reason people don't use Ada is because its blocks are delmited by indentation. Every knows that's just ugly. Real hackers would use a language like Python instead.

    That and it's named after a girl.

  • by zmower ( 20335 ) on Monday May 08, 2000 @03:14AM (#1086117)
    We now have Ada95 output from GLADE [pn.org] targeting the GtkAda [eu.org] binding.

    I have both Ada and C experience. Ada thrashes C for non-trivial programs. Maybe the problem is that most open source software starts out as trivial programs that scratch a programmer's itch.

    C++ and Ada95 are roughly equivalent. To my mind C++ suffers from its class-centric view of everything. Look at the hassles C++ has with singletons while Ada neatly solves this with the package structure.

    adapower [adapower.com] is a good place to start for those interested.
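    For example, state held in a package body gives you a singleton by construction. A sketch (Logger is a made-up name):

```ada
package Logger is
   procedure Log (Message : String);
end Logger;

with Ada.Text_IO;
package body Logger is
   --  Body-level state: exactly one instance exists, with no
   --  class, constructor, or instance-management code at all.
   Line_Count : Natural := 0;

   procedure Log (Message : String) is
   begin
      Line_Count := Line_Count + 1;
      Ada.Text_IO.Put_Line (Natural'Image (Line_Count) & ": " & Message);
   end Log;
end Logger;
```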

  • by Skweetis ( 46377 ) on Sunday May 07, 2000 @12:57PM (#1086118) Homepage
    Ada.Text_IO.Put_Line(Item => "Hello, World!");

    -- OR --

    printf("Hello, World!\n");

    Ada95 source code takes up twice as much disk space as C source! All joking aside, I don't use Ada because there aren't very many library bindings for it. GNAT is a pretty good compiler, though, and Ada fixes a lot of problems inherent in C, but Ada probably just isn't used because C/C++ are the established languages of choice for programming on Unix/Linux systems (most of the OS being coded in C).
  • by remande ( 31154 ) <remande@bigfoot. c o m> on Monday May 08, 2000 @01:40AM (#1086119) Homepage
    A lot of people have given a lot of technical reasons for not choosing Ada. However, I think that there is one, larger reason for this. Ada has been considered beneath contempt by most geeks since before most geeks were coding. It's one of those things that we are supposed to hate as a matter of culture. I've read a bunch of stuff about how it is built by committee with all the problems thereof, but I've never actually seen it, and I can count on one hand the number of people I know who have used it.

    If you don't hate Ada and COBOL, you get shunned by hackerdom. You are allowed to code in either, but must swear loudly while doing so. People who actually choose to generally get discounted as idiots.

    I'm not saying that the above is right, but only that the above is so.

  • by CFN ( 114345 ) on Sunday May 07, 2000 @05:30PM (#1086120)
    Having done my undergraduate CS at NYU, the 'N' in the GNU NYU Ada Translator (GNAT - how many people knew that), and having studied under two of the developers of that system (their company's site [gnat.com]), I was fortunate enough to be exposed to the virtues of this language.

    The Ada syntax is very similar to Pascal's, but the language is very similar to Java (especially thread support). Once you get accustomed to using the language, you notice that you can be very, very productive using it, much more than in C++. I would also say that the learning curve (minus the time to familiarize yourself with the syntax if you are coming from the C/C++ world) is pretty easy to climb, and you can be proficient at it soon.

    For my compiler class, we needed to write a compiler for a (non-OO) subset of Ada95. I chose to write mine in Ada95, because we were allowed to use the GNAT lexer/parser (which was written in Ada95 as well). I would strongly recommend that anyone wanting to learn large scale programming, or who is developing something that will be open source, take a look at the source code [nyu.edu]. It is clearly written, well documented, easily understood, and even beautiful. It is a wonderful example of how someone should write code if they expect others to read it. It also shows just how beautiful and understandable the language is.
