Re: crashes loading saved file
- Subject: Re: crashes loading saved file
- From: James Maxwell <email@hidden>
- Date: Mon, 28 May 2012 18:51:05 -0700
Hi Charlie,
Thanks for the reply.
Hmm… I wonder if it would be enough to just copy the objects that lead to circular references? I'll think about it… I agree that it would be worth a try, if only to determine whether the problem really is related to circular references.
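Just to make that concrete, here's roughly what I have in mind (Node, children, and parentNode are just stand-ins for my actual classes): only the downward parent -> child links would go into the archive, and the upward links would be rebuilt after decoding, so the archive itself contains no cycle.

// Sketch only -- assumes Node inherits directly from NSObject and that
// children/parentNode are ordinary properties on it.

- (void)encodeWithCoder:(NSCoder *)aCoder
{
    [aCoder encodeObject:self.children forKey:@"children"];
    // deliberately NOT encoding self.parentNode
}

- (id)initWithCoder:(NSCoder *)aDecoder
{
    self = [super init];
    if (self) {
        self.children = [aDecoder decodeObjectForKey:@"children"];
        for (Node *child in self.children) {
            child.parentNode = self;   // re-link the upward pointers here instead
        }
    }
    return self;
}

The other thing I keep seeing mentioned is encodeConditionalObject:forKey: for the back-pointer, but as far as I can tell that only controls whether the parent gets pulled into the archive through that particular reference; if the parent is encoded unconditionally somewhere else, the decoded graph still ends up with the cycle, so it wouldn't really test anything here.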
thanks,
J.
On 2012-05-28, at 6:29 PM, Charlie Dickman wrote:
> J,
>
> If it were my problem, even though keyed archiving is supposed to handle circular references, I would try eliminating the circular references and see whether it makes a difference (even if you have to duplicate objects multiple times, since at this point it would only be a test).
>
> I have no credentials to validate this approach other than 35 years of professional programming experience.
>
> On May 28, 2012, at 6:14 PM, James Maxwell wrote:
>
>> Okay, so I'm back to trying to tackle this annoying unarchiving crash…
>>
>> Just to recap the problem: I get an EXC_BAD_ACCESS crash when unarchiving certain files from disk. The file is a keyed archive, which contains a fairly complex custom object graph with plenty of circular references (i.e., parentNode <---> childNode stuff). When this graph is relatively small I have no problems, but when it gets larger, it crashes. As mentioned previously, one seemingly significant thing about the crash is that the backtrace is >25,000 frames long. I've taken this to suggest that either: A) some circular reference is getting stuck in a loop, or B) the graph is large enough that, while the unarchiver is trying to keep track of circular references, the stack overflows. I don't know whether either of these possibilities makes sense, so I'm wondering how I might test for each?
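>>
>> (One crude way I can think of to check A: put a static counter in -initWithCoder: for my node class and compare the number of decode calls against the number of nodes I know are in the graph. If the count climbs well past the node count, something really is being decoded in a loop; if it matches, that points back at B. Just a sketch, with decodeCount as a made-up name:)
>>
>> static NSUInteger decodeCount = 0;
>>
>> - (id)initWithCoder:(NSCoder *)aDecoder
>> {
>>     self = [super init];
>>     if (self) {
>>         decodeCount++;
>>         if (decodeCount % 1000 == 0) {
>>             NSLog(@"initWithCoder: has run %lu times", (unsigned long)decodeCount);
>>         }
>>         // ... normal decoding of ivars goes here ...
>>     }
>>     return self;
>> }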
>>
>> Partly because the massive backtrace isn't just a list of identical calls, and partly because the unarchiver is supposed to handle circular references, I kind of suspect B. But if this is the case, how can I get around it? I already use archiveRootObject:toFile: for archiving, so I would think I'd be getting the built-in circular-reference handling… Accordingly, I use unarchiveObjectWithFile: to unarchive the graph. Everything I've done is pretty basic, so perhaps my structure calls for a more advanced approach(?) I did include @autoreleasepool blocks in a couple of places where temporary objects could be created during initialization, but that didn't help…
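>>
>> (For B, one test I've been thinking about is doing the unarchiving on a secondary thread whose stack is much bigger than the default, via NSThread's -setStackSize:. If the crash goes away with a bigger stack, that would point pretty strongly at stack overflow. Roughly like this, where -unarchiveGraphAtPath: just stands in for my actual loading method:)
>>
>> NSThread *loader = [[NSThread alloc] initWithTarget:self
>>                                            selector:@selector(unarchiveGraphAtPath:)
>>                                              object:path];
>> [loader setStackSize:16 * 1024 * 1024];   // must be set before -start, in multiples of 4 KB
>> [loader start];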
>>
>> So, I guess I'm wondering whether anyone else has had similar problems, and how you went about solving them. I should also mention that the file itself is very big - even a 13 MB file will cause the crash.
>>
>> By the way, it's not the super common failure to retain the unarchived object… Also, NSZombies and Guard Malloc don't show any obvious problems, and the static analyzer shows no memory errors.
>>
>> Any further thoughts greatly appreciated.
>>
>> J.
>>
>> James B Maxwell
>> Composer/Doctoral Candidate
>> School for the Contemporary Arts (SCA)
>> School for Interactive Arts + Technology (SIAT)
>> Simon Fraser University
>>
>
> Charlie Dickman
> email@hidden
>
James B Maxwell
Composer/Doctoral Candidate
School for the Contemporary Arts (SCA)
School for Interactive Arts + Technology (SIAT)
Simon Fraser University
_______________________________________________
Cocoa-dev mailing list (email@hidden)
Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden