Re: signgam symbol missing from i386 (Intel) math.h
- Subject: Re: signgam symbol missing from i386 (Intel) math.h
- From: Dominic Blais <email@hidden>
- Date: Thu, 1 Jun 2006 21:27:30 -0700
On Jun 1, 2006, at 8:47 PM, Chad Armstrong wrote:
I'm trying to compile a mathematical parser as a Universal Binary,
and it compiles fine with the GCC 3.3 PPC compiler. But as soon as
I set it to build for Intel chips, I get this error:
error: 'signgam' undeclared (first use in this function)
Firstly, AFAIK, the GCC 3.3 provided with Apple's development tools
won't cross-compile to Intel. From
http://developer.apple.com/documentation/DeveloperTools/Conceptual/cross_development/UniversalBinaries/chapter_4_section_1.html :
"you must use GCC 4.0 to compile for Mac OS X v10.4 and for
Intel-based Macintosh computer"
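In practice that means selecting GCC 4.0 in Xcode, or from the
command line something like the following (the file names here are
made up; the -arch flags and the 10.4u SDK are the standard
ingredients of a command-line Universal Binary build):

    gcc-4.0 -arch ppc -arch i386 \
        -isysroot /Developer/SDKs/MacOSX10.4u.sdk \
        -o parser parser.c -lm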
Beyond that, using GCC 4.0.1 and 4.2 on an Intel Mac, I found signgam
was available. However, it is no longer listed in the lgamma man
page, so I don't think it's supported. Given a valid argument to
lgamma, signgam should hold the sign of the gamma function, either
1 (positive) or -1 (negative). But I find it always contains
-559038737, which is 0xDEADBEEF read as a signed 32-bit integer (a
common fill pattern for uninitialized memory), so it looks like
lgamma never actually sets it there.
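Here's a minimal test case, in case anyone wants to reproduce this.
It assumes only the standard math.h interface; the tgamma() fallback
at the end is just one way to recover the sign without touching
signgam (lgamma_r, a BSD/glibc extension that returns the sign
through an out-parameter, would be another, if your libm provides it):

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double x = -2.5;   /* gamma(-2.5) is negative, about -0.945 */

        /* Traditional interface: lgamma() returns log|gamma(x)| and
           is supposed to store the sign in the global int signgam. */
        double lg = lgamma(x);
        printf("lgamma(%g) = %g, signgam = %d\n", x, lg, signgam);

        /* Portable workaround: recover the sign from tgamma(), which
           returns gamma(x) itself (C99), via the signbit() macro. */
        int sign = signbit(tgamma(x)) ? -1 : 1;
        printf("sign via tgamma = %d\n", sign);

        return 0;
    }

On a correct libm the first line should print signgam = -1 for
x = -2.5; here it prints the garbage value above.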
Cheers,
Dominic