mach_absolute_time() vs. sleep()
- Subject: mach_absolute_time() vs. sleep()
- From: Kristopher Matthews <email@hidden>
- Date: Tue, 29 Apr 2008 20:01:27 -0500
I'm having some strange trouble with these two calls. Example code
follows.
#include <mach/mach_time.h>
#include <stdint.h>

/* Convert the span between two mach_absolute_time() readings to seconds. */
double mach_elapsed_time(double start, double endTime)
{
    uint64_t diff = endTime - start;
    static double conversion = 0.0;

    if (conversion == 0.0) {
        mach_timebase_info_data_t info;
        kern_return_t err = mach_timebase_info(&info);
        if (err == 0)
            conversion = 1e-9 * (double) info.numer / (double) info.denom;
    }
    return conversion * (double) diff;
}
uint64_t s = mach_absolute_time();
NSLog(@"test");
double duration = mach_elapsed_time(s, mach_absolute_time());
At this point, "duration" is a reasonable value in seconds (about
0.005, IIRC). This code also works for measuring another block of code
I have that writes several MB to disk - the time it reports is in line
with the difference between the timestamps of the surrounding NSLog
statements. But this:
uint64_t s = mach_absolute_time();
sleep(6);
double duration = mach_elapsed_time(s, mach_absolute_time());
Produces completely unrealistic results - this specific example comes
in at 0.387 seconds instead of the expected 6.
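For comparison, here is a sketch of the same conversion with both
timestamps kept as uint64_t end to end, so the subtraction never
passes through double arithmetic. (The helper name
mach_elapsed_time_u64 is just for this sketch - I'm not claiming this
is the cause, only isolating that one variable.)

#include <mach/mach_time.h>
#include <stdint.h>
#include <stdio.h>
#include <unistd.h>

/* Same timebase conversion as above, but the raw tick counts stay
   uint64_t, so the subtraction is done in integer arithmetic. */
static double mach_elapsed_time_u64(uint64_t start, uint64_t end)
{
    static double conversion = 0.0;
    if (conversion == 0.0) {
        mach_timebase_info_data_t info;
        if (mach_timebase_info(&info) == 0)
            conversion = 1e-9 * (double) info.numer / (double) info.denom;
    }
    return conversion * (double)(end - start);
}

int main(void)
{
    uint64_t s = mach_absolute_time();
    sleep(6);
    printf("elapsed: %f s\n", mach_elapsed_time_u64(s, mach_absolute_time()));
    return 0;
}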
Any thoughts? (I know, I know, this is a BS test case. I just happened
upon it, and I'm curious why it happens. I have no other problems
timing things this way.)
Regards,
Kris