Re: Performance issues with large binary data in database
- Subject: Re: Performance issues with large binary data in database
- From: Hugi Thordarson via Webobjects-dev <email@hidden>
- Date: Wed, 11 Sep 2019 09:32:42 +0000
Hi Markus,
when I had problems like this it was usually because I was "accidentally"
fetching a lot of the blob-containing objects (like EOF firing a relationship
to check delete rules when performing a delete).
But I'd start by attaching a profiler like VisualVM to your application to see
where processing power/memory is being consumed. If that doesn't reveal
anything interesting, you might try unmodeling the BLOB attribute and either
(a) creating a cover method on your entity class that just does a raw SQL query
for the blob data, or
(b) creating a separate entity for the same table that contains only the BLOB
attribute plus some identifying attributes you can use to fetch the data (no
relationships).
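Option (a) might look something like the minimal sketch below. The `Document` class, the loader function, and the caching behavior are all illustrative assumptions; in a real WebObjects app the loader would be a raw SQL fetch (e.g. via EOUtilities) keyed on the primary key, so the blob never enters the EO snapshot.

```java
import java.util.function.Function;

// Sketch of option (a): the BLOB is not a modeled attribute, so it is
// never part of the editing-context snapshot. A cover method fetches
// it on demand through an injected loader (a stand-in for a raw SQL
// query against the document table). All names here are hypothetical.
public class Document {
    private final int primaryKey;
    private final Function<Integer, byte[]> blobLoader; // stands in for raw SQL
    private byte[] cachedPdf; // fetched lazily, cached on the instance only

    public Document(int primaryKey, Function<Integer, byte[]> blobLoader) {
        this.primaryKey = primaryKey;
        this.blobLoader = blobLoader;
    }

    // Cover method: the blob only crosses the wire when actually asked for.
    public byte[] pdfData() {
        if (cachedPdf == null) {
            cachedPdf = blobLoader.apply(primaryKey);
        }
        return cachedPdf;
    }

    public static void main(String[] args) {
        // Fake loader simulating the raw SQL fetch for demonstration.
        Function<Integer, byte[]> fakeQuery = pk -> ("pdf-" + pk).getBytes();
        Document doc = new Document(42, fakeQuery);
        System.out.println(new String(doc.pdfData())); // blob fetched only here
    }
}
```

The point of the pattern is that a fetch of a hundred `Document` rows moves only metadata; the bytes move only for the one document the user actually opens.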
Shots in the dark though :). If you have control over the DB schema I'd move
the blob field to a separate table (or move the data to the filesystem, but
sounds like that's out of the question).
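If the filesystem route ever does become an option, the layout is simple: the row keeps only a relative path and a little metadata, and the bytes live on disk. A minimal sketch (class and method names are illustrative, not an existing API):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch of "files on disk, metadata in the DB": the database row would
// hold only what DocumentMeta carries; the PDF bytes never pass through
// the editing context. Names here are hypothetical.
public class BlobStore {
    record DocumentMeta(String relativePath, long sizeBytes) {}

    static DocumentMeta storeDocument(Path root, String name, byte[] data)
            throws IOException {
        Path target = root.resolve(name);
        Files.write(target, data);              // bytes go straight to disk
        return new DocumentMeta(name, data.length); // all the DB row needs
    }

    public static void main(String[] args) throws IOException {
        Path root = Files.createTempDirectory("docs");
        DocumentMeta meta = storeDocument(root, "invoice.pdf",
                "pdf-bytes".getBytes());
        System.out.println(meta.relativePath() + " " + meta.sizeBytes());
    }
}
```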
Cheers,
- hugi
> On 11 Sep 2019, at 09:12, Markus Ruggiero via Webobjects-dev
> <email@hidden> wrote:
>
> A customer of mine has issues with performance. We have identified that
> ec.saveChanges() sometimes takes ages to finish (tens of seconds, sometimes a
> minute or more). The application is basically a document hub, nothing too
> fancy. But the developer (who is not available anymore) once decided to store
> all documents in the database (PostgreSQL). The docs are primarily PDFs with
> sizes of several hundred KB up to multiple megabytes. The problems cannot be
> tied directly to the size of the documents; sometimes a large doc goes through
> in a couple of seconds whereas sometimes a smaller one takes ages. We have
> found that the bottleneck must be inside the editing context; the database
> statements go through nicely. I think performance gets lost during ec
> snapshot handling in conjunction with JVM memory requirements and garbage
> collection, but we have not been able to really pinpoint where the
> time is lost.
>
> I am inclined to recommend that the customer store the files on disk and
> keep only metadata in the database, relieving the editing context of handling
> multi-megabyte snapshots.
>
> So my questions to the community are as follows:
> - does anyone have experience storing multi-megabyte binary data in the
> database?
> - how would one analyse such a situation (where EOEditingContext et al. is
> not debuggable due to lack of source)?
> - what would you recommend?
>
> Thanks for any tips.
> ---markus---
> _______________________________________________
> Do not post admin requests to the list. They will be ignored.
> Webobjects-dev mailing list (email@hidden)
> Help/Unsubscribe/Update your Subscription:
>
> This email sent to email@hidden