Delivered-To: darwin-dev@lists.apple.com

I've got a time-critical application that must run very, very fast. The application is written in C++ with Project Builder, running on Mac OS X. The application runs fine, but it has a speed bottleneck: I've traced it down to a single call to free() which is freeing a memory buffer (approx. 1 MB) that was earlier allocated with malloc(). This single call to free() takes about 1,100 microseconds, while a normal call to free() takes around 5 to 50 microseconds; I've timed them in several test programs on the Mac and on Windows. Even the other calls to free() within this program run fast.

Does anyone have any suggestions as to why this one call to free() might be taking so long? I need this call to free() to run much faster. The only clue I have is that this memory buffer has been heavily used (lots of reading and writing), and the application is rather large (500 MB of virtual memory, per its VSIZE), so I'm sure that virtual memory is getting paged in and out. I'm unable to step inside the free() call in the debugger, so I have no idea what it is doing inside that takes so long.

There is no simple way to redesign the program to avoid the free(): I need to repeatedly allocate and free memory buffers, each of a variable size, and there is no way to predict the sizes ahead of time, not even the maximum size.

Any suggestions would be appreciated! Thanks in advance!

_______________________________________________
Darwin-dev mailing list (Darwin-dev@lists.apple.com)