signgam symbol missing from i386 (Intel) math.h
- Subject: signgam symbol missing from i386 (Intel) math.h
- From: Chad Armstrong <email@hidden>
- Date: Thu, 1 Jun 2006 22:47:25 -0500
I'm trying to compile a mathematical parser as a Universal Binary,
and it compiles fine with the GCC 3.3 compiler for PPC. But as soon
as I try to build it for Intel chips, I get this error:
error: 'signgam' undeclared (first use in this function)
From my research, it seems that this is happening because the
signgam variable is declared in the PPC version of math.h
(/usr/include/architecture/ppc/math.h), but it is missing from the
Intel math.h (/usr/include/architecture/i386/math.h).
I tried adding the following line to either the i386 math.h, or
inside my program:
extern int signgam;
However, if I do that, I get a different error, this time at link time:
Undefined symbols:
_signgam
collect2: ld returned 1 exit status
For those interested, here is the use of the signgam variable, which
helps to calculate non-integer factorials, like 3.1!
double factorial(double val)
{
    double g, lg;

    lg = lgamma(val + 1);     /* log of |gamma(val+1)| */
    g = signgam * exp(lg);    /* signgam is set as a side effect of lgamma() */
    return (g);
}
Any ideas on how I can get this to compile correctly for Intel
systems? And even if I get it running properly on my machine, will
it work properly on other Intel machines? Right now I'm using a
PowerMac G4, running Xcode 2.2 (I'm currently downloading Xcode 2.3).
Chad
Cocoa-dev mailing list (email@hidden)