Why does free() take 1,100 microseconds?
- Subject: Why does free() take 1,100 microseconds?
- From: "neal olander" <email@hidden>
- Date: Tue, 26 Jul 2005 07:19:12 -0700
I've got a time-critical application that must run very, very fast. The
application is written in C++ with Project Builder, running on Mac OS X.
The application runs fine, but there is a speed bottleneck I have: I've
traced it down to a single call to free() which is freeing a memory buffer
(approx 1 MB) that was earlier allocated with malloc().
This single call to free() takes about 1,100 microseconds.
A normal call to free() takes around 5 to 50 microseconds; I've timed
them in several test programs on the Mac and on Windows. Even the other
calls to free() within this program run fast.
Does anyone have any suggestions why this one call to free might be taking
so long? I need to get this call to free() to run much faster.
The only clue I have is that this memory buffer has been heavily used (lots
of reading and writing), and the application is rather large (500 MB
VSIZE of virtual memory), so I'm sure that virtual memory is getting paged
in and out.
I'm unable to get inside the free() call in the debugger, so I have no idea
what it is doing inside that takes so long.
There is no simple way to re-design the program to avoid the free(): I need
to repeatedly allocate and free memory buffers, each of a variable size (and
there is no way to predict the sizes ahead of time, not even the maximum).
Any suggestions would be appreciated! Thanks in advance!
_______________________________________________
Darwin-dev mailing list (email@hidden)