Re: CoreAudio vs CoreData
- Subject: Re: CoreAudio vs CoreData
- From: Paul Davis <email@hidden>
- Date: Thu, 13 Oct 2011 20:54:32 -0400
On Thu, Oct 13, 2011 at 6:50 PM, patrick machielse <email@hidden> wrote:
> Op 14 okt. 2011, om 00:05 heeft Paul Davis het volgende geschreven:
>
>> On Thu, Oct 13, 2011 at 5:54 PM, patrick machielse <email@hidden> wrote:
>>
>>> Each file has an associated 'volume envelope': an array of points describing file playback volume as a function of time. These points are stored as a set of managed objects in the CoreData document. While playing back the document, the current playback volume must be interpolated from these envelope points for each file on each render callback. This would translate to a fetch of these points, and -[NSManagedObjectContext executeFetchRequest:error:] requires locking when performed from multiple threads.
>>
>> Let's try to put this more directly then: you cannot use CoreData
>> within the render callback. It's that simple. Using CoreData is fine,
>> but you cannot use it in that context.
>
>
> Paul,
>
> If that is the case, could you suggest a suitable solution for the usage case above? How could we avoid accessing CoreData there -- without bolting on a terrible kludge?
Suck the data out ahead of time, and make it available to the render
callback via some other (lock-free and block-free) method.
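
Something along these lines, as a minimal sketch: a single writer (the
main thread, after fetching from CoreData) publishes an immutable
snapshot of the envelope points via an atomic pointer swap, and the
render callback only ever does an atomic load plus arithmetic. The
names (EnvelopePoint, publish_envelope, envelope_gain_at) are invented
for illustration and are not part of any Apple API.

/*
 * Sketch: publish a volume-envelope snapshot from the main thread and
 * read it lock-free in the render callback.  Single writer, single
 * reader assumed.
 */
#include <stdatomic.h>
#include <stdlib.h>
#include <string.h>

typedef struct {
    double time;    /* seconds from file start  */
    double gain;    /* linear gain at that time */
} EnvelopePoint;

typedef struct {
    size_t        count;
    EnvelopePoint points[];   /* sorted by time */
} EnvelopeSnapshot;

static _Atomic(EnvelopeSnapshot *) g_envelope = NULL;

/* Main thread: fetch the managed objects, copy the plain values into a
 * snapshot, then swap the pointer.  CoreData is never touched again
 * until the next edit. */
void publish_envelope(const EnvelopePoint *pts, size_t count)
{
    EnvelopeSnapshot *snap = malloc(sizeof *snap + count * sizeof *pts);
    snap->count = count;
    memcpy(snap->points, pts, count * sizeof *pts);

    EnvelopeSnapshot *old =
        atomic_exchange_explicit(&g_envelope, snap, memory_order_acq_rel);

    /* NOTE: 'old' may still be in use by the render thread for the rest
     * of the current callback.  A real implementation defers freeing it
     * (e.g. a trash list drained on the main thread after a grace
     * period); freeing it here immediately would be a race. */
    (void)old;
}

/* Render callback side: linear interpolation, no locks, no allocation. */
double envelope_gain_at(double t)
{
    EnvelopeSnapshot *snap =
        atomic_load_explicit(&g_envelope, memory_order_acquire);
    if (!snap || snap->count == 0)
        return 1.0;

    const EnvelopePoint *p = snap->points;
    if (t <= p[0].time)               return p[0].gain;
    if (t >= p[snap->count - 1].time) return p[snap->count - 1].gain;

    for (size_t i = 1; i < snap->count; i++) {
        if (t < p[i].time) {
            double f = (t - p[i - 1].time) / (p[i].time - p[i - 1].time);
            return p[i - 1].gain + f * (p[i].gain - p[i - 1].gain);
        }
    }
    return p[snap->count - 1].gain;
}

The only delicate part is reclaiming old snapshots; everything the
render thread does is wait-free. If envelopes get large, replace the
linear scan with a binary search over the sorted points.
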
CoreData is not a suitable place to store dynamic, realtime data
(where "realtime" is meant in the audio sense, not the financial
trading sense). If you need to "store" modifications made by the user,
don't go directly to CoreData except as a serialization technique.