[NTLUG:Discuss] Ada...etc. (was Re: programming classes...)
Darin W. Smith
darin_ext at darinsmith.net
Tue Jul 1 18:08:37 CDT 2003
On Tue, 1 Jul 2003 17:13:54 -0500 (CDT), Bug Hunter
<bughuntr at one.ctelcom.net> wrote:
> I had not heard that the DoD had stopped strictly requiring Ada. That is
> interesting.
>
>
> bug
People were starting to use some C and C++ at least by 1997. NASA had
relaxed their Ada requirements even earlier, and when I was at JSC in 1996
everything I worked on was being implemented in "fancy C" (C++ compiler
used to compile C with //'s and classes as fancy structs).
As for the DoD relaxing the requirement, how else could Microsoft get their
foot in the door for their OSes to be used on programs? Food for
thought... In 1997, while I was writing code in Ada for one Navy project I
was reading about the Navy's "smart destroyer" prototype that ran on
Windows NT and was dead in the water for 4 hours because some database had
a zero in it and it crashed the OS. I thought to myself "well...here we
go...if we have to trust NT for our national defense, we're all dead."
You can ignore the rest of this if you would like (what am I saying, this
is email! You can ignore and delete the whole bloody thing if you like!
Email is nice that way). It is very long and is only one guy's opinion...
<SOAPBOX>
As for taking loopholes to the extent of voiding the purpose of Ada: I
would counter that most programmers who take extreme measures to get around
features of a language (any language) probably do not understand how to use
the language (feature) properly. In addition, if they will take such
measures, they will usually do similar things in very sloppy ways. For
example, many people "cast away" lint errors. That is, they put in
unnecessary (and wrong) casts in order to get lint to quit complaining
rather than take a step back and see what it is that lint is trying to draw
their attention to. Backing up and looking at what you did in your code
design to set off one of the compiler's error/warning rules is almost
always:
a) the right thing to do.
b) more work.
c) requires more skill.
d) all of the above.
I'd say (d) is the correct answer. It may be that the error is not really
an error, in which case you can specifically tell lint not to flag it
(there are certain pragmas in most Ada compilers that provide a hint to the
compiler that you are doing something on purpose and not to punish you for
it). Then, you have taken the time to understand why it barfed, and
verified that this is what you really intended. It also tends to stand out
in code-reviews, and you can then say why you did this. Even better,
comment the code at that point to explain why it is necessary. Simply
coming up with a way to get around the error generation without analyzing
for design problems is a recipe for more bugs down the road...but this is
how many people use lint, and how many people have used Ada in the past
(and how many people use "C-style" casts in C++). It is unfortunate. Ada
makes it difficult for a reason, and that reason is to make you think about
how you can avoid a potentially dangerous practice in your code design.
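To make the "casting away" point concrete, here's a small C sketch (the function names and the specific warning are my own illustration, not from any particular project). With gcc's -Wsign-compare, comparing a signed loop index against an unsigned length draws a warning; the cast silences it without asking the real question:

```c
#include <stddef.h>

/* Bad: "casting away" the warning.  The cast makes lint/gcc shut up,
 * but hides the real design question: can i ever exceed INT_MAX here? */
int sum_bad(const int *buf, size_t len)
{
    int total = 0;
    for (int i = 0; (size_t)i < len; i++)  /* cast silences -Wsign-compare */
        total += buf[i];
    return total;
}

/* Better: step back and fix the design so the types agree to begin
 * with.  No cast, no warning, no hidden overflow hazard. */
int sum_good(const int *buf, size_t len)
{
    int total = 0;
    for (size_t i = 0; i < len; i++)
        total += buf[i];
    return total;
}
```

Both versions "work" on small inputs, which is exactly why the lazy cast survives code reviews when nobody asks what the warning was really about.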
But unfortunately, we live in a schedule-driven world. That's how we got
things like Windows. Focusing on features and release dates, not
correctness and completeness of design and implementation.
I'll tell you right up front that the defense and (to a lesser extent,
unfortunately) medical industries have far more stable coding practices
than the vast majority of commercial S/W. Not as flashy, but an emphasis
on getting things right. I'll take "works well" over "looks nice" any day
of the week when my life is on the line, thank you very much. In fact,
I'll take that even when my life isn't on the line. You can always make it
look pretty later. I want things that work. That's the whole reason I
support Linux. It can be frustrating when things don't work, but this is
usually when I am using releases of software marked "alpha" or "non-stable"
or "beta", so I know and expect that they won't work perfectly...especially
for what I'm paying. On the other hand, if I go down and pay $500 for an
Office suite and it is full of bugs and bloat, I feel completely ripped
off. When developing, after I get raw functionality established, I tend to
turn all warnings into errors for at least a few days, so that I can see
and correct what the compiler (and lint, if I can use it too) can find for
me. Believe me, it saves a lot of time in the long-run.
90% of the time, when a co-worker has had a problem getting some code to
"work right" and requests my help, I run it through lint or through the
compiler with -pedantic and immediately get pointed to the problem. It is
usually a misunderstanding of a feature of the language. Thus the danger
of using language features you don't understand. That's the reason so many
company coding standards for C++ forbid the use of templates, friends, and
multiple inheritance (some of the most powerful features of the language).
The writers of the coding standards are usually afraid that too few
developers know how to properly use those features, so they forbid them.
They may forbid them because they themselves don't know how to use them
right. That's my biggest complaint about C++. It not only gives you more
than enough rope to hang yourself with, but gives you forty different
gallows, as well as an electric chair and a guillotine with which to try and
execute yourself. There is an executioner to help you commit suicide, but
no good person around to show you the way out of sure self-destruction. So
we have coding standards to tell you what features of a language not to
use, and how to use certain other features.
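The sort of misunderstanding that -Wall or -pedantic points to immediately often looks like this (a hypothetical example, not from anyone's actual code):

```c
/* Classic bug: assignment (=) where comparison (==) was intended.
 * gcc -Wall flags this with "suggest parentheses around assignment
 * used as truth value" -- pointing straight at the problem line. */
int is_zero_buggy(int x)
{
    if (x = 0)          /* assigns 0, so this branch is never taken */
        return 1;
    return 0;
}

/* What was actually meant. */
int is_zero_fixed(int x)
{
    if (x == 0)
        return 1;
    return 0;
}
```

The buggy version compiles cleanly without -Wall and even "works" for nonzero inputs, which is why this class of mistake tends to survive until someone runs the code through lint or a pedantic compile.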
I would opine that a truly good language design is one that has only truly
necessary *and useful* features, and only one way to use each feature. If
you look at Ada's design, that was really a guiding principle. On the
other hand, many assembly languages fit that bill well also (though some do
not--there has been an explosion of mnemonics in PPC assembly, for
example). Neither Ada nor most dialects of assembly are particularly easy
for the beginner, though--and ease of learning is another mark of truly
good language design. When you
look at it, FORTRAN77 and BASIC were both very good language designs.
They've just fallen out of favor since everyone these days prefers
procedural and object-oriented languages. They are good languages
(by my definition) in that they have a limited, simple to understand syntax
with well-defined features and rules for the usage of those features.
</SOAPBOX>
--
D!
Darin W. Smith
AIM: JediGrover
"If you pick up a starving dog and make him prosperous, he will not bite
you. This is the principal difference between a dog and a man." --Mark
Twain "Pudd'nhead Wilson's Calendar"