Re: Core Data app becomes a memory hog
- Subject: Re: Core Data app becomes a memory hog
- From: Christian Weykopf <email@hidden>
- Date: Tue, 5 Jun 2007 10:04:52 +0200
On 05.06.2007, at 09:32, Hal Mueller wrote:
Hi Hal,
I ran into this recently.
It is an autorelease problem. Try something similar to this:
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
for (i = 0; i < 20000; i++)
{
    createObject();
    /* Drain the pool every magicalNumber iterations so the
       autoreleased objects from createObject() don't pile up
       across the whole loop. */
    if (i % magicalNumber == 0)
    {
        [pool release];
        pool = [[NSAutoreleasePool alloc] init];
    }
}
[pool release];
Chris
Summary: my Core Data app creates objects programmatically, and
fairly quickly (20,000 objects) makes the rest of the machine
unusable until the app crashes.
I have a Core Data project running that uses a live data stream to
populate the datastore. The model has a bunch of moving objects
(ships) which periodically are updated via the data stream. I've
defined 4 entities: one for the objects, one for the raw incoming
data packets, and two for a couple of types of report that I'm
interested in (processed versions of the data packets).
I am creating the managed objects programmatically, and setting the
relationships as the objects are created. That is, when a new data
packet comes in, I save the packet, update the associated ship, and
create a relationship between the ship and the packet (send
"setSender:" to the packet, where "sender" is a relationship I
defined in the data modeler). I might also create a second object
for that data packet in a different table, and relate that object
to its ship too.
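(A minimal sketch of that per-packet insertion step. The entity name "Packet" and the helper's name and arguments are my assumptions; only the "sender" relationship name comes from Hal's description.)

```objc
#import <CoreData/CoreData.h>

/* Hypothetical helper: insert one packet and relate it to its ship. */
static NSManagedObject *AttachPacketToShip(NSManagedObject *ship,
                                           NSManagedObjectContext *context)
{
    NSManagedObject *packet =
        [NSEntityDescription insertNewObjectForEntityForName:@"Packet"
                                      inManagedObjectContext:context];
    /* Equivalent to [packet setSender:ship]; Core Data maintains the
       inverse relationship on the ship automatically. */
    [packet setValue:ship forKey:@"sender"];
    return packet;
}
```

Every such insert keeps the new object (and the objects it touches) registered in the context until the next save, which is where the memory growth comes from.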
By the time I get up to 150-200 ships, and 10000 or more data
packets, I begin to have memory problems. I can see in Activity
Monitor that the virtual memory hits 1.5 GB quickly, and eventually
gets up to 3+ GB. Shortly after this the program crashes while
trying to create a new managed object.
I can bring the machine (dual G5/3 GB RAM) to its knees by doing a
search from the prototype UI while the program is reading data
packets.
I generally use the SQL backing store, but performance of the
program is equally poor regardless of the store type. Performance
is worse on my MacBook (Intel dual core/2 GB RAM).
I have a "saveAndRefault" method defined which does a save: on the
datastore, then iterates over [datastore registeredObjects] and
calls refreshObject:mergeChanges:NO. Thus I am turning all of my
objects back into faults. If I do this periodically (every minute
or so), memory stays manageable.
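(A sketch of what that saveAndRefault could look like, assuming "datastore" is the NSManagedObjectContext; save:, registeredObjects, and refreshObject:mergeChanges: are the real NSManagedObjectContext methods.)

```objc
- (void)saveAndRefault
{
    NSError *error = nil;
    if (![datastore save:&error]) {
        NSLog(@"save failed: %@", error);
        return;
    }
    /* Snapshot the set first: refreshing objects can mutate
       -registeredObjects while we iterate. */
    NSEnumerator *e =
        [[[datastore registeredObjects] allObjects] objectEnumerator];
    NSManagedObject *object;
    while ((object = [e nextObject]) != nil)
        [datastore refreshObject:object mergeChanges:NO];
}
```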
I could set up a timer to call saveAndRefault a lot, but this feels
like I must be doing things the hard way.
Comments, anyone?
Hal
_______________________________________________
Cocoa-dev mailing list (email@hidden)
Do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden
Kind regards,
Christian Weykopf
--
Meilenstein Mac OS Software
Neue Strasse 5
D-31582 Nienburg
Fax: +49 (0) 5021 91 24 45
<http://www.meilenstein.de/>
Managing Directors:
Georg Hennig, Dirk Musfeldt, Christian Weykopf
Amtsgericht Walsrode, 8 HRB 31265