Re: CoreData - large data set is slow to load on app launch - optimisation tips?
- Subject: Re: CoreData - large data set is slow to load on app launch - optimisation tips?
- From: Ruslan Zasukhin <email@hidden>
- Date: Sat, 03 Dec 2005 22:31:14 +0200
- Thread-topic: CoreData - large data set is slow to load on app launch - optimisation tips?
On 12/3/05 9:57 PM, "Simon Liu" <email@hidden> wrote:
Hi Simon,
> The image data is around 62MB. When I removed it from my main entity,
> the sql store was reduced from 80MB to 18MB and loaded in around 2
> seconds on launching the app. A dramatic improvement.
Right. But your db has also become more than 4 times smaller.
> Yes, currently the image data is an attribute of an entity. It was an
> oversight. I'm going to make it into a relationship to see how that
> improves things. The rest of the data model is pretty lean.
> I wonder if the coredata folk can tell me if it's a good idea to have
> data blobs like images all stored in a separate data store or not?
Why not? That is a normal trick.
Let me explain the original problem.

SQLite stores BLOBs as part of the record. When you go to some record, it
loads the whole record, BLOBs included, into RAM.

Valentina, in contrast, does something smarter: it does NOT load BLOB data
when you go to a record. So no matter how many BLOBs a table has, or how big
they are, they do NOT affect the speed of iteration.

Even cooler, Valentina structures its tables so that any column operation
(indexing, LIKE, search, ...) does NOT depend on the size or the number of
other fields in the table.

So if you used Valentina you simply would not see this BLOB problem at all.
With SQLite, the workaround is to extract the image data into a separate table.
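To make the Core Data side of that concrete, here is a rough sketch in modern
Swift (the entity and attribute names "Photo", "ImageData", "title", "bytes"
are made up for illustration, not taken from Simon's actual model) of a
programmatically built model where the blob lives in its own entity behind a
to-one relationship. The image bytes then stay a fault until the relationship
is actually traversed, so fetching the lightweight entity does not drag the
blobs into RAM:

import CoreData

// Sketch only: entity and attribute names are hypothetical.
let model = NSManagedObjectModel()

// Lightweight entity that gets fetched in bulk.
let photo = NSEntityDescription()
photo.name = "Photo"
let title = NSAttributeDescription()
title.name = "title"
title.attributeType = .stringAttributeType

// Heavy entity holding only the raw image bytes.
let imageData = NSEntityDescription()
imageData.name = "ImageData"
let bytes = NSAttributeDescription()
bytes.name = "bytes"
bytes.attributeType = .binaryDataAttributeType

// To-one relationship Photo.imageData <-> ImageData.photo; the blob is loaded
// only when this relationship fault fires, not when Photo rows are fetched.
let toImage = NSRelationshipDescription()
toImage.name = "imageData"
toImage.destinationEntity = imageData
toImage.maxCount = 1

let toPhoto = NSRelationshipDescription()
toPhoto.name = "photo"
toPhoto.destinationEntity = photo
toPhoto.maxCount = 1

toImage.inverseRelationship = toPhoto
toPhoto.inverseRelationship = toImage

photo.properties = [title, toImage]
imageData.properties = [bytes, toPhoto]
model.entities = [photo, imageData]

The same split can of course be drawn in the Xcode data model editor instead
of built in code.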
> If the image data is a 1:1 relationship will performance be the same with 1
> data store versus 2 (or more?).
I predict that you will again get those 2 seconds.
> I want my app to be able to handle up to around 50,000 records and images.
> The store size would probably be around 800 MB in all.
But you still have a problem, Simon. If load time scales linearly with record
count:

    5,000 records  => 2 seconds
    50,000 records => 20 seconds

In fact you will get even worse -- about 40-50 seconds, I think.
And this is on a computer with 1 GB of RAM. On computers with 512 MB the
results should be even worse, because a store size of 800 MB is near
critical: it is comparable to, or bigger than, RAM.
Simon, before you take any further steps, I recommend you spend the time to
build such a dummy store with 50,000 records and see what you get. Then you
will have facts on hand to think about...
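If it helps, here is a rough sketch of how such a dummy store could be seeded
and timed, again in modern Swift and assuming the hypothetical
"Photo"/"ImageData" entities from the earlier sketch plus an
already-configured NSManagedObjectContext:

import CoreData
import Foundation

// Seed a throwaway store with `count` dummy records so fetch cost can be
// measured at realistic volume. Entity and key names are illustrative only.
func seedDummyStore(context: NSManagedObjectContext, count: Int = 50_000) throws {
    for i in 0..<count {
        let photo = NSEntityDescription.insertNewObject(forEntityName: "Photo",
                                                        into: context)
        photo.setValue("Dummy photo \(i)", forKey: "title")

        let image = NSEntityDescription.insertNewObject(forEntityName: "ImageData",
                                                        into: context)
        image.setValue(Data(count: 16_000), forKey: "bytes")   // ~16 KB placeholder blob
        photo.setValue(image, forKey: "imageData")

        // Save in batches so memory stays bounded while seeding.
        if i % 1_000 == 999 {
            try context.save()
            context.reset()
        }
    }
    try context.save()
}

// Time a metadata-only fetch against the populated store.
func timeFetch(context: NSManagedObjectContext) throws {
    let start = Date()
    let request = NSFetchRequest<NSManagedObject>(entityName: "Photo")
    let results = try context.fetch(request)
    print("Fetched \(results.count) records in \(Date().timeIntervalSince(start)) s")
}

Run the seeding once, relaunch, and compare the timed fetch with and without
the blob split before committing to a design.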
--
Best regards,
Ruslan Zasukhin
VP Engineering and New Technology
Paradigma Software, Inc
Valentina - Joining Worlds of Information
http://www.paradigmasoft.com
[I feel the need: the need for speed]