
Algol 68 design mistakes?

In article <>,
Ben Franchuk  <> wrote:
>>                                   Neither Pascal nor C has a proper
>> way of saying that "pi" is what you get when you evaluate "4*arctan (1)"
>> and is *not* a variable, though both have bolt-on features that go part
>> of the way.
>I think #define PI 3.14 is fine.

Unfortunately not.  One obvious problem is that most (all?) debuggers
are blind to `#define'.

More seriously, you presumably aren't really advocating using only 2
decimal digits of precision: I wouldn't want to cross any bridge which
was designed with the help of a program that believed pi to be 3.14!

So in a real program you might prefer to say

    #define PI 3.141592

which is as many significant digits as a float can hold on the machine
I'm using right now.  Or maybe it should be this, to use all the digits
of a double (again, on my current machine)?

    #define PI 3.141592653589793

But then, what about next year when I get a fatter processor; or 5
years' time, 10 years' time?

It turns out that it really is better to let the machine calculate pi
for itself.  But C bites you again:

    #define PI (4 * atan(1.0))

will be re-evaluated every time PI is used; and, being a function call,
it can't appear where C demands a constant expression, such as an array
bound or a case label.

Of course, in the particular case of pi, you should be using M_PI from
<math.h> instead, but I hope I've demonstrated the original point.

Tim Goodwin   | "Not ideal, I grant you, but life's
Leicester, UK | like that." -- Ian Batten

Original headers:

From: (Tim Goodwin)
Newsgroups: comp.lang.misc
Subject: Re: Algol 68 design mistakes?
Date: 27 Nov 2000 13:48:57 -0000
Message-ID: <8vtoov$uso$>
References: <8vp5gm$4f0$>
