
Re: Gaussian blur with core image, using CPU or GPU?


  • Subject: Re: Gaussian blur with core image, using CPU or GPU?
  • From: Vijay Malhan <email@hidden>
  • Date: Fri, 23 May 2008 20:58:04 +0530


On 23-May-08, at 7:41 PM, Jordan Woehr wrote:

First, I sent this once but I don't think it made it onto the list. I've
done a quick search of the archives and couldn't find it. I apologize in
advance if this ends up being a double post.


Hi everyone,

I'm trying to write a bilateral filter using Core Image, with the
specific goal of having it perform the filtering on the GPU for high
performance, as it will be used on large 4D data sets.

I started out by reading the Core Image Programming Guide, and on the
"Writing Nonexecutable Filters" page I came across this sentence:

"Core Image assumes that the ROI coincides with the domain of
definition. This means that nonexecutable filters are not suited for
such effects as blur or distortion."

Does this mean that it is not possible to write a bilateral filter
which does the computations on the GPU?
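
As a rough illustration only (modern Swift, a placeholder kernel rather than real
bilateral math, and a made-up file path): an executable CIKernel-based filter can
supply its own region-of-interest callback, which is exactly the part a
nonexecutable filter cannot express:

import CoreImage
import Foundation

// Hypothetical sketch: the ROI callback tells Core Image how much source it
// must provide for each requested output rect. A neighbourhood filter such as
// a blur or bilateral filter needs the output rect grown by its radius.
let kernelSource = """
kernel vec4 passThrough(sampler src)
{
    // Placeholder: a real bilateral filter would gather nearby samples here
    // and weight them by both spatial distance and intensity difference.
    return sample(src, samplerCoord(src));
}
"""

let radius: CGFloat = 5.0  // illustrative neighbourhood radius

if let kernel = CIKernel(source: kernelSource),
   let input = CIImage(contentsOf: URL(fileURLWithPath: "/path/to/input.tiff")) {
    let output = kernel.apply(
        extent: input.extent,
        // For any requested output rect, ask for the source rect grown by the radius.
        roiCallback: { _, rect in rect.insetBy(dx: -radius, dy: -radius) },
        arguments: [input])
    _ = output
}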

I've looked at the Core Image Gaussian blur filter. Thus far I cannot find
out whether it is executed on the CPU or the GPU, and I was wondering if
there is a way to determine this. Is there source code available for
this filter, and if so, where is it? I have had no luck finding it so
far.

Have you gone through the FunHouse sample app in the developer examples?
I tried this for you:
I used FunHouse to apply a Gaussian blur (it uses Core Image) to a very large image.
I had Activity Monitor open, showing the CPU usage. With FunHouse, there was no evident increase in CPU usage.


Then I used Photoshop to apply a Gaussian blur to the same large image, and there was a clear spike in CPU usage.
So I think this suggests that the FunHouse implementation, which uses Core Image, uses the GPU for processing.
I'm using a MacBook Pro with an ATI Radeon X1600 graphics card.
See if this helps.
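
One rough way to check this in code, sketched in modern Swift (the file path and
the 20-point radius are only placeholders): render the same CIGaussianBlur output
once with a default CIContext, which is free to use the GPU, and once with the
software renderer forced on, then compare the timings and the CPU usage:

import CoreImage
import Foundation

let url = URL(fileURLWithPath: "/path/to/large-image.tiff")
guard let input = CIImage(contentsOf: url),
      let blurFilter = CIFilter(name: "CIGaussianBlur") else {
    fatalError("could not load the image or create the filter")
}

blurFilter.setValue(input, forKey: kCIInputImageKey)
blurFilter.setValue(20.0, forKey: kCIInputRadiusKey)
guard let blurred = blurFilter.outputImage else {
    fatalError("filter produced no output")
}

// Default context: Core Image is free to use the GPU.
let gpuContext = CIContext()
// Software renderer: evaluation is forced onto the CPU.
let cpuContext = CIContext(options: [.useSoftwareRenderer: true])

for (label, context) in [("default", gpuContext), ("software", cpuContext)] {
    let start = Date()
    _ = context.createCGImage(blurred, from: input.extent)
    print("\(label) render: \(Date().timeIntervalSince(start)) s")
}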


There is also a way of tiling the huge image data set for better processing. I don't remember it right now, but I'll come back to you with that.
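
Purely as a generic illustration, and not necessarily the tiling approach meant
above (modern Swift, hypothetical 512-pixel tile size): one way to tile a large
CIImage is to render its extent in fixed-size sub-rects rather than in a single
call, which keeps the working set of each render small:

import CoreImage
import CoreGraphics
import Foundation

// Walk the output extent in fixed-size tiles and render each tile separately.
func renderInTiles(_ image: CIImage, with context: CIContext,
                   tileSize: CGFloat = 512) -> [CGImage] {
    var tiles: [CGImage] = []
    let extent = image.extent
    var y = extent.minY
    while y < extent.maxY {
        var x = extent.minX
        while x < extent.maxX {
            // Clamp the tile to the image extent at the right and top edges.
            let tileRect = CGRect(x: x, y: y, width: tileSize, height: tileSize)
                .intersection(extent)
            if let tile = context.createCGImage(image, from: tileRect) {
                tiles.append(tile)
            }
            x += tileSize
        }
        y += tileSize
    }
    return tiles
}

// Example use: let tiles = renderInTiles(blurredImage, with: CIContext())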



Also, if anyone has experience with this type of filter and could
point me in the right direction for implementing it with Core Image,
it would be much appreciated.

Lots of questions, I know, but I hope someone can help.

Thank you,
Jordan
_______________________________________________

Cocoa-dev mailing list (email@hidden)

Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com

Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden



References:
  • Gaussian blur with core image, using CPU or GPU? (From: "Jordan Woehr" <email@hidden>)
