# Algol 68 design mistakes?

In article <3A185D33.B570E60F@jetnet.ab.ca>, Ben Franchuk
<bfranchuk@jetnet.ab.ca> wrote:

>> Neither Pascal nor C has a proper way of saying that "pi" is what
>> you get when you evaluate "4*arctan (1)" and is *not* a variable,
>> though both have bolt-on features that go part of the way.
>
> I think #define PI 3.14 is fine.

Unfortunately not. One obvious problem is that most (all?) debuggers
are blind to `#define'. More seriously, you presumably aren't really
advocating using only 2 decimal digits of precision: I wouldn't want
to cross any bridge which was designed with the help of a program that
believed pi to be 3.14!

So in a real program you might prefer to say

    #define PI 3.141592

which is as many significant digits as a float can hold on the machine
I'm using right now. Or maybe it should be this, to use all the digits
of a double (again, on my current machine)?

    #define PI 3.141592653589793

But then, what about next year when I get a fatter processor; or in 5
years' time, 10 years' time? It turns out that it really is better to
let the machine calculate pi for itself. But C bites you again:

    #define PI (4 * atan(1.0))

will be evaluated each time PI is used.

Of course, in the particular case of pi, you should be using M_PI from
<math.h> instead, but I hope I've demonstrated the original point.

Tim.

--
Tim Goodwin   | "Not ideal, I grant you, but life's
Leicester, UK | like that." -- Ian Batten

Original headers:

From: tjg@star.le.ac.uk (Tim Goodwin)
Newsgroups: comp.lang.misc
Subject: Re: Algol 68 design mistakes?
Date: 27 Nov 2000 13:48:57 -0000
Message-ID: <8vtoov$uso$1@fozzie.star.le.ac.uk>
References: <8vp5gm$4f0$1@oyez.ccc.nottingham.ac.uk> <3A185D33.B570E60F@jetnet.ab.ca>