Re: Running out of memory with Core Data
- Subject: Re: Running out of memory with Core Data
- From: Jed Soane <email@hidden>
- Date: Wed, 27 Jul 2005 10:09:25 +1200
Are you checking to see if the transaction already exists in the data
store? That really slows you down. We were importing 300,000 items
and it was taking hours; 20,000 items took 10 minutes. To improve the
speed we stored the managed objects in a dictionary (one per entity).
Then, to check whether we had already imported a particular item, we
checked the dictionary rather than fetching from the managed object
context, e.g.:
- (NSManagedObject *)ingestProject:(NSString *)projectName
{
    // Check the in-memory cache first instead of fetching
    // from the managed object context.
    NSManagedObject *project = [_projectDictionary objectForKey:projectName];
    if (project == nil)
    {
        NSLog(@"Adding new project: %@", projectName);
        project = [NSEntityDescription
            insertNewObjectForEntityForName:@"Project"
                 inManagedObjectContext:_managedObjectContext];
        [project setValue:projectName forKey:@"name"];

        // Cache the new object so later lookups skip the context.
        [_projectDictionary setObject:project forKey:projectName];
    }
    return project;
}
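
The surrounding import loop was along these lines (a simplified
sketch rather than our exact code; projectNames stands in for
whatever collection you're importing from, and _projectDictionary is
an NSMutableDictionary created before the import starts):

// Build the per-entity cache once, up front.
_projectDictionary = [[NSMutableDictionary alloc] init];

NSEnumerator *enumerator = [projectNames objectEnumerator];
NSString *projectName;
while ((projectName = [enumerator nextObject]) != nil)
{
    NSManagedObject *project = [self ingestProject:projectName];
    // ... attach the imported items to their project here ...
}

// One save at the end rather than one per item.
NSError *error = nil;
if (![_managedObjectContext save:&error])
{
    NSLog(@"Save failed: %@", error);
}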
This took the import of 20,000 items down to 20 seconds and 300,000
items down to several minutes. We could have optimised this more, but
we considered that acceptable for a one-off event.
Cheers
Jed
On Jul 27, 2005, at 2:23 AM, Ian G. Gillespie wrote:
I added autorelease pools and it no longer runs out of memory.
Thanks for the help.
Unfortunately, it still remains TERRIBLY slow. I used Shark but it
seems that the calls that take the most time are private methods
like _NSGetUsingKeyValueGetter and such. To import 10 accounts
with 100 to 2000 transactions each takes several minutes.
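
The autorelease pool fix mentioned above usually looks something like
this inside the import loop (a sketch only; records and the batch
size of 500 are placeholders, not from the original code):

NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
unsigned count = 0;

NSEnumerator *enumerator = [records objectEnumerator];
NSDictionary *record;
while ((record = [enumerator nextObject]) != nil)
{
    // ... create or update managed objects for this record ...

    if (++count % 500 == 0)
    {
        // Drain accumulated temporary objects so memory stays flat.
        [pool release];
        pool = [[NSAutoreleasePool alloc] init];
    }
}
[pool release];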