Re: CoreImage problems with very large images


  • Subject: Re: CoreImage problems with very large images
  • From: Ilan Volow <email@hidden>
  • Date: Fri, 16 Nov 2007 12:17:14 -0500

If the image processing required lends itself well to parallel processing and you've got a bunch of fast computers lying around, you may want to investigate the possibility of using Xgrid to send the tiles out to multiple machines.

-- Ilan

On Nov 16, 2007, at 9:17 AM, Chris Blackburn wrote:

Hey,

The image you are trying to process decompresses to an image that is about 968 megabytes in size. This is almost certainly bigger than your graphics card's VRAM. Assuming that Core Image is programmed cleverly, it will take the image, break it down into tiles that overlap slightly, and reconstruct the image afterwards. This won't be really slow, but it may well chew up a gig of RAM in the process.
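
Rough sanity check of that figure, assuming 8 bits per channel RGBA (4 bytes per pixel); the exact number depends on the pixel format Core Image ends up using:

    #include <stdio.h>

    int main(void)
    {
        /* 18,000 x 14,000 pixels at 4 bytes per pixel (8-bit RGBA, an assumption) */
        unsigned long bytes = 18000UL * 14000UL * 4UL;
        printf("%lu bytes, about %.0f MiB\n", bytes, bytes / (1024.0 * 1024.0));
        /* prints: 1008000000 bytes, about 961 MiB */
        return 0;
    }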

If Core Image is not implemented to deal with large images, it might just give up on the GPU, load the whole lot into RAM, and process it using AltiVec or SSE. This may well be very, very slow indeed.
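
You can also force the CPU path yourself to see what it costs, rather than waiting for Core Image to fall back on its own. A minimal sketch (the MakeSoftwareCIContext name is made up; the destination CGContextRef is whatever bitmap context you are already drawing into):

    #import <QuartzCore/QuartzCore.h>

    // Sketch only: kCIContextUseSoftwareRenderer forces Core Image onto the
    // software (CPU) renderer, which makes the GPU-vs-CPU cost difference
    // easy to measure directly.
    static CIContext *MakeSoftwareCIContext(CGContextRef destination)
    {
        NSDictionary *options =
            [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                        forKey:kCIContextUseSoftwareRenderer];
        return [CIContext contextWithCGContext:destination options:options];
    }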

You might want to try tiling and reconstructing the image yourself so you never overstep your VRAM.
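
Something along these lines, purely as a sketch (ProcessImageInTiles and tileSize are made-up names, CISepiaTone is just a stand-in for whatever filter you actually run, and the CGImage top-left vs. Core Image bottom-left coordinate flip is ignored for brevity; this only works as-is for per-pixel filters, anything with a kernel radius needs the slight tile overlap mentioned above):

    #import <Cocoa/Cocoa.h>
    #import <QuartzCore/QuartzCore.h>

    // Illustrative only: crop the source into tiles, run a filter on each
    // tile, and draw the result back into one big destination bitmap context,
    // so no single GPU upload is larger than tileSize x tileSize pixels.
    static void ProcessImageInTiles(CGImageRef source, CGContextRef dest, size_t tileSize)
    {
        size_t width  = CGImageGetWidth(source);
        size_t height = CGImageGetHeight(source);
        CIContext *ciContext = [CIContext contextWithCGContext:dest options:nil];

        for (size_t y = 0; y < height; y += tileSize) {
            for (size_t x = 0; x < width; x += tileSize) {
                @autoreleasepool {
                    CGRect tileRect = CGRectMake(x, y,
                                                 MIN(tileSize, width - x),
                                                 MIN(tileSize, height - y));

                    // Only this tile's pixels ever get wrapped in a CIImage.
                    CGImageRef tileCG = CGImageCreateWithImageInRect(source, tileRect);
                    CIImage *tile = [CIImage imageWithCGImage:tileCG];

                    // Stand-in filter; substitute whatever processing you need.
                    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
                    [filter setDefaults];
                    [filter setValue:tile forKey:kCIInputImageKey];

                    // Draw the processed tile into its slot in the destination.
                    [ciContext drawImage:[filter valueForKey:kCIOutputImageKey]
                                  inRect:tileRect
                                fromRect:CGRectMake(0, 0, tileRect.size.width,
                                                          tileRect.size.height)];
                    CGImageRelease(tileCG);
                }
            }
        }
    }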

Chris


On 16 Nov 2007, at 08:07, Jim Crate <email@hidden> wrote:

I'm using CoreImage in an app, but it is unusable with very large images, e.g. a 52-megabyte 18,000 x 14,000 jpg. A 5000 x 4000 jpg image loads and processes with great performance, but trying to process the large image leads to very high memory use and much swapping to disk. I tried to load the large image in the Core Image Fun House example app, and ended up force-quitting after waiting 10-15 minutes.

The old implementation using NSImage could load and process (scale, rotate, watermark, broken grayscale) the large images, and while it wasn't fast, it was at least usable.

Is CoreImage just not meant to handle images this big? Or are there ways to gain acceptable performance with these large images with CoreImage? Will the problem possibly go away on a Mac Pro?

Thanks,

Jim

Ilan Volow "Implicit code is inherently evil, and here's the reason why:"



_______________________________________________

Cocoa-dev mailing list (email@hidden)

Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com

Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden


References:
  • CoreImage problems with very large images (From: Jim Crate <email@hidden>)
  • Re: CoreImage problems with very large images (From: Chris Blackburn <email@hidden>)
