Re: Optimizing Core Data for large time series


  • Subject: Re: Optimizing Core Data for large time series
  • From: Peter Passaro <email@hidden>
  • Date: Thu, 10 May 2007 15:12:05 +0100

Please ignore the previous message in this thread. It was my own mistake - I was not resetting my NSData objects, so they were growing exponentially on each save to the store.

I wrote:
I wanted to report back after a little more experimentation. I have found the optimum BLOB size for my application to be about 1 MB - the import and data access speed is reasonable at this size. What is concerning me now is that the storage overhead seems to be pretty hefty for placing these BLOBs inside an SQLite persistent store. For each 1 MB BLOB placed in the store, I am adding roughly 20 MB to the SQLite file. Can anybody give me some direction as to what all that overhead is? Is this typical for NSData objects stored as BLOBs in SQLite stores?



Peter Passaro


_______________________________________________

Cocoa-dev mailing list (email@hidden)


