Re: Memory issues when loading a document
- Subject: Re: Memory issues when loading a document
- From: Ulf Dunkel <email@hidden>
- Date: Tue, 23 Oct 2012 19:49:51 +0200
- Organization: invers Software & DSD.net
> Important bit you haven't told us: is your app 32bit, 64bit, or both?
Oh, sorry, it's a 32-bit app which still supports Mac OS X 10.4.11 and
later. We're already working on the 64-bit version of that app, but
this is an issue in the current version which I would like to
understand and - hopefully - be able to fix for the 32-bit version as well.
> You have very fine-grained control available to you. Read up on the
> document architecture and the different NSDocument methods you can
> override.
As far as I can see, the loading behavior is very inefficient.
First, OS X loads the document: to be able to decode it, the whole file
has to be read into RAM, which needs the mentioned 900 MB.
Then the coder decodes the objects, which brings the same data into RAM
a second time.
Finally, when decoding the PDFs, OS X creates yet another data buffer
for each PDF object, decodes it, and creates the PDF content from it.
So each PDF object ends up in RAM three times:
1) in the loaded raw document,
2) in an object created from the document (containing the PDF raw data),
3) the PDF itself.
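Would something like this already avoid copy 1)? Instead of letting
NSDocument read the whole file into memory, one could override the
URL-based reading method and hand the unarchiver a memory-mapped
NSData, so the raw archive stays on disk and is only paged in while the
coder reads it. Just a sketch of what I mean (MyRootObject and
setRootObject: are placeholders for our real classes):

    // Sketch: memory-map the archive instead of reading it completely.
    - (BOOL)readFromURL:(NSURL *)absoluteURL
                 ofType:(NSString *)typeName
                  error:(NSError **)outError
    {
        // NSMappedRead maps the file; pages are only brought in as the
        // unarchiver actually touches them (available since 10.4).
        NSData *archive = [NSData dataWithContentsOfFile:[absoluteURL path]
                                                  options:NSMappedRead
                                                    error:outError];
        if (archive == nil) return NO;

        MyRootObject *root = [NSKeyedUnarchiver unarchiveObjectWithData:archive];
        if (root == nil) return NO;

        [self setRootObject:root];   // placeholder accessor
        return YES;
    }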
Only after this process has finished for the whole document is the
memory freed again, down to the roughly 900 MB the document actually
requires.
To me, this seems very inefficient. I wonder if there is any way to not
decode the document as one monolithic blob, but to decode and convert
it object by object, deallocating the RAM for each object as soon as it
has been created, rather than only after the whole document has been
processed.
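What I am imagining is roughly this in our model's initWithCoder:
(again only a sketch; it assumes the PDFs are archived under individual
keys rather than inside one big array, and "itemCount", "pdfData%d" and
the ivar names are made up):

    #import <Quartz/Quartz.h>   // PDFDocument

    // Inside the @implementation of our (placeholder) model class:
    - (id)initWithCoder:(NSCoder *)coder
    {
        if ((self = [super init]) == nil) return nil;

        int count = [coder decodeIntForKey:@"itemCount"];
        pdfDocuments = [[NSMutableArray alloc] initWithCapacity:(NSUInteger)count];

        for (int i = 0; i < count; i++) {
            // One pool per object, so the decoded raw data can go away
            // before the next object is decoded (pre-ARC, 10.4-style).
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

            NSString *key = [NSString stringWithFormat:@"pdfData%d", i];
            NSData *rawData = [coder decodeObjectForKey:key];   // autoreleased

            PDFDocument *pdf = [[PDFDocument alloc] initWithData:rawData];
            if (pdf != nil) {
                [pdfDocuments addObject:pdf];
                [pdf release];
            }

            [pool drain];   // raw PDF data for this object released here
        }
        return self;
    }

I do not know, though, whether the keyed unarchiver itself holds on to
every decoded object until the whole document has been decoded - which
is exactly the part I would like to understand.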
---Ulf Dunkel