Re: AudioServerPlugin crashing
- Subject: Re: AudioServerPlugin crashing
- From: Jeff Moore <email@hidden>
- Date: Thu, 21 Mar 2013 10:37:07 -0700
There were bug fixes to the server plug-in hosting in 10.8.3. FWIW, the system itself uses server plug-ins to handle a few device types without any issues I'm aware of.
At any rate, the right thing to do when you have a question like this is to file a bug in BugReporter for us to look at.
--
Jeff Moore
Core Audio
Apple
On Mar 20, 2013, at 7:10 PM, Tuviah Snyder <email@hidden> wrote:
> I've noticed that on 10.8.3, my audio plugin which works correctly with 10.8.2 now crashes. I debugged and it appears to pass extremely large values to DoIOOperation.
>
> If I attempt to memset the buffer using ioMainBuffer and inIOBufferFrameSize it crashes. If I do nothing it crashes.
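One way to make this failure mode diagnosable is a defensive check on the frame count before touching the buffer. Below is a simplified sketch: the stripped-down signature, error code, and `kMaxExpectedFrameSize` bound are illustrative assumptions, not the real `AudioServerPlugIn.h` interface.

```c
#include <stdint.h>
#include <string.h>

/* Illustrative stand-ins; the real DoIOOperation signature and error
   codes live in AudioServerPlugIn.h. */
typedef int32_t OSStatus;
enum { kNoErr = 0, kBadFrameSizeErr = -50 };

/* An upper bound on any frame count the host could legitimately ask
   for; the exact value here is an assumption for this sketch. */
#define kMaxExpectedFrameSize 4096u
#define kBytesPerFrame (2u * sizeof(float)) /* stereo float32 */

OSStatus ZeroIOBuffer(void *ioMainBuffer, uint32_t inIOBufferFrameSize)
{
    /* Reject absurdly large frame counts like the ones observed
       instead of memset-ing past the end of the buffer. */
    if (ioMainBuffer == NULL || inIOBufferFrameSize == 0 ||
        inIOBufferFrameSize > kMaxExpectedFrameSize) {
        return kBadFrameSizeErr;
    }
    memset(ioMainBuffer, 0, inIOBufferFrameSize * kBytesPerFrame);
    return kNoErr;
}
```

A check like this won't fix the host passing bad values, but it turns a crash into an error that can be logged and attached to a bug report.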
>
> Anyone have a working user mode audio plugin under 10.8.3?
>
> Were there major changes I'm not aware of?
>
> best,
> Tuviah
>
> On Mar 20, 2013, at 12:00 PM, email@hidden<mailto:email@hidden> wrote:
>
> Send Coreaudio-api mailing list submissions to
> email@hidden<mailto:email@hidden>
>
> To subscribe or unsubscribe via the World Wide Web, visit
> https://lists.apple.com/mailman/listinfo/coreaudio-api
> or, via email, send a message with subject or body 'help' to
> email@hidden
>
> You can reach the person managing the list at
> email@hidden
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Coreaudio-api digest..."
>
>
> Today's Topics:
>
> 1. JACKiOS is here (Paul Davis)
> 2. Re: MusicEventIterator questions (Aran Mulholland)
> 3. Re: JACKiOS is here (Aran Mulholland)
> 4. MusicPlayer not sending Note OFF ? (Pablo Ansonia)
> 5. Re: MusicPlayer not sending Note OFF ? (Admiral Quality)
> 6. Dolby encoded audio (Phil Montoya)
> 7. Re: Dolby encoded audio (Jeff Moore)
> 8. Re: Dolby encoded audio (Phil Montoya)
> 9. Re: Dolby encoded audio (Jeff Moore)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Tue, 19 Mar 2013 17:19:17 -0400
> From: Paul Davis <email@hidden>
> To: CoreAudio API <email@hidden>
> Subject: JACKiOS is here
> Message-ID:
> <CAFa_cKnEcsG-haoW+RY2Tae=email@hidden>
> Content-Type: text/plain; charset="iso-8859-1"
>
> I'm not one of the developers involved with the port, packaging, or
> distribution of JACKiOS, but I thought it would be important to let all of
> you folks know that JACK for iOS is now available in the app store.
>
>
> https://itunes.apple.com/us/app/jack-audio-connection-kit/id615485734?mt=8
>
> More information can be found here:
>
> http://www.crudebyte.com/jack-ios/
>
> What is JACK? In a nutshell: low latency, synchronous interconnect of
> audio and MIDI data streams between applications and audio devices. It's
> also been around the block a few times: JACK was first developed on Linux
> in 2003, ported to OS X in 2004, and to Windows in 2008 (roughly speaking).
> The API for applications is identical across all platforms.
>
> ------------------------------
>
> Message: 2
> Date: Wed, 20 Mar 2013 08:30:42 +1100
> From: Aran Mulholland <email@hidden>
> To: Ross Bencina <email@hidden>
> Cc: Core Audio Mailing List <email@hidden>
> Subject: Re: MusicEventIterator questions
> Message-ID:
> <CAB9YEfBnQDX1ZzydOmeh7Pau2WOnFAUw7PxK0OB1uW7wptw4=email@hidden>
> Content-Type: text/plain; charset="iso-8859-1"
>
> Thanks Ross and Paul,
>
> What you describe is kind of what I have started writing.
>
> I have a structure that is an ordered linked list built from the UI thread.
> Then I have another routine that builds a structure for the audio thread to
> consume. They are slightly different structures as I wanted the audio
> thread to have an array so I can perform a binary search to get
> 'interesting' events for the current time period being rendered.
>
> Paul's idea of having a windowed snapshot of the data is an interesting one
> that I may consider as I have not yet worked out how to give the audio
> thread the most up to date array or how to dispose of stale arrays, but I'm
> having fun. Best problem I've worked on in ages.
>
> I guess I'm most interested in how to pass the audio thread the latest
> array. I'll have a look at that article you wrote.
>
> Thanks
>
>
> On Wed, Mar 20, 2013 at 3:26 AM, Ross Bencina <email@hidden> wrote:
>
> On 20/03/2013 1:05 AM, Paul Davis wrote:
>
> On Tue, Mar 19, 2013 at 7:42 AM, Ross Bencina <email@hidden> wrote:
>
> Hi Aran,
>
> I agree with Paul that if the idea is to have a single mutable data
> structure that is accessible from multiple threads that's kind of
> hard.
>
> But the way I would do it is have two single-threaded data
> structures and propagate changes via a FIFO:
>
>
> there is an alternate approach too that also uses a FIFO. the "audio"
> thread (running in an RT context, the one where rendering is done)
> doesn't need to know about anything that, well, anything that it doesn't
> need to know about. it only cares about "now", meaning specific data
> required to render audio for the current block of time.
>
>
> Modulo some "schedule ahead" extra safety margin. You're basically trading
> latency for ease of implementation.
>
>
>
> so you can leave your fancy-pants data structure in non-RT land, and
> just make sure that a "linearized" version of it gets pushed into a FIFO
> where the audio thread can pull from. that way, you can use mutexes or
> whatever synchronization mechanisms you want in non-RT land, and the
> audio thread just gets a simplified, accurate, "windowed" snapshot of it
> to use for rendering.
>
>
> This requires choosing the window length correctly. It needs to be long
> enough to anticipate any disruption in the non-real-time context, and short
> enough to retain interactive response.
>
> This is going to work better for some kinds of applications than others.
>
>
>
> of course, this gets complex when you jump around on the timeline and
> have to refill the FIFO correctly,
>
>
> It also gets difficult if you want to have lowest latency MIDI control
> over algorithmic sequence generation (for example).
>
> That said, it's what SuperCollider3 does (it splits lang and event
> scheduling into a separate thread from audio).
>
>
>
> and in this sense, Ross' idea has a
> different kind of simplicity on its side.
>
>
> It's definitely a "kind of" simplicity :)
>
> Ross.
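The two-structure, FIFO-propagation scheme discussed above is often implemented as a wait-free single-producer/single-consumer ring of snapshot pointers. The following is a minimal sketch, not code from this thread: the snapshot type (`void *`) and the capacity are placeholders, and real code would add cache-line padding between the indices.

```c
#include <stdatomic.h>
#include <stddef.h>

#define FIFO_CAPACITY 8 /* must be a power of two */

typedef struct {
    void *slots[FIFO_CAPACITY];  /* pointers to immutable snapshots */
    _Atomic size_t head;         /* advanced by producer (UI thread) */
    _Atomic size_t tail;         /* advanced by consumer (audio thread) */
} SnapshotFifo;

/* UI thread: publish a freshly built snapshot. Returns 0 if full,
   in which case the caller keeps ownership and may retry later. */
int fifo_push(SnapshotFifo *f, void *snapshot)
{
    size_t head = atomic_load_explicit(&f->head, memory_order_relaxed);
    size_t tail = atomic_load_explicit(&f->tail, memory_order_acquire);
    if (head - tail == FIFO_CAPACITY)
        return 0; /* full */
    f->slots[head & (FIFO_CAPACITY - 1)] = snapshot;
    atomic_store_explicit(&f->head, head + 1, memory_order_release);
    return 1;
}

/* Audio thread: take the next snapshot, or NULL if none. Never blocks,
   never allocates -- safe to call from the render callback. */
void *fifo_pop(SnapshotFifo *f)
{
    size_t tail = atomic_load_explicit(&f->tail, memory_order_relaxed);
    size_t head = atomic_load_explicit(&f->head, memory_order_acquire);
    if (tail == head)
        return NULL; /* empty */
    void *s = f->slots[tail & (FIFO_CAPACITY - 1)];
    atomic_store_explicit(&f->tail, tail + 1, memory_order_release);
    return s;
}
```

Aran's "how to dispose of stale arrays" question has a symmetric answer: the audio thread pushes superseded snapshots back through a second FIFO of the same shape, and the UI thread frees them, keeping all allocation and deallocation off the real-time path.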
>
> _______________________________________________
> Do not post admin requests to the list. They will be ignored.
> Coreaudio-api mailing list (email@hidden)
> Help/Unsubscribe/Update your Subscription:
> https://lists.apple.com/mailman/options/coreaudio-api/
>
> This email sent to email@hidden
>
>
> ------------------------------
>
> Message: 3
> Date: Wed, 20 Mar 2013 08:32:34 +1100
> From: Aran Mulholland <email@hidden>
> To: Paul Davis <email@hidden>
> Cc: CoreAudio API <email@hidden>
> Subject: Re: JACKiOS is here
> Message-ID:
> <CAB9YEfBzVZb1z38YkxEC7yW-8oSJMeXEwFwz=email@hidden>
> Content-Type: text/plain; charset="iso-8859-1"
>
> looks like audiobus has a competitor...
>
>
> On Wed, Mar 20, 2013 at 8:19 AM, Paul Davis <email@hidden> wrote:
>
> I'm not one of the developers involved with the port, packaging, or
> distribution of JACKiOS, but I thought it would be important to let all of
> you folks know that JACK for iOS is now available in the app store.
>
>
> https://itunes.apple.com/us/app/jack-audio-connection-kit/id615485734?mt=8
>
> More information can be found here:
>
> http://www.crudebyte.com/jack-ios/
>
> What is JACK? In a nutshell: low latency, synchronous interconnect of
> audio and MIDI data streams between applications and audio devices. It's
> also been around the block a few times: JACK was first developed on Linux
> in 2003, ported to OS X in 2004, and to Windows in 2008 (roughly speaking).
> The API for applications is identical across all platforms.
>
>
>
>
>
>
> ------------------------------
>
> Message: 4
> Date: Tue, 19 Mar 2013 18:02:59 -0400
> From: Pablo Ansonia <email@hidden>
> To: email@hidden
> Subject: MusicPlayer not sending Note OFF ?
> Message-ID:
> <CAJWvs3y_ZAbb=email@hidden>
> Content-Type: text/plain; charset=ISO-8859-1
>
> Hi all,
>
> iOS: successfully loaded a MIDI file with MusicPlayer to an endPoint.
> The Note On messages are arriving fine in my MIDIReadProc and I am
> forwarding them on to the AudioUnit sampler (with Vibes AUPreset).
>
> Even though the sampler seems to snub the notes correctly, I'm not
> seeing any MIDI Note OFF messages/packets in the ReadProc (I have
> verified that they are in the file). Is there some "back channel" of
> communication?
>
> Any pointers would be greatly appreciated,
>
> Thanks,
> PA
>
>
> ------------------------------
>
> Message: 5
> Date: Tue, 19 Mar 2013 18:11:15 -0400
> From: Admiral Quality <email@hidden>
> To: email@hidden
> Subject: Re: MusicPlayer not sending Note OFF ?
> Message-ID:
> <CAM5-hCyu9V5cFwi6guaBDBa+L_3Tc7BCo=email@hidden>
> Content-Type: text/plain; charset=ISO-8859-1
>
> MIDI note-on messages with a velocity of zero are equivalent to MIDI
> note-off. (This is to enable running status, which saves a bit of
> bandwidth on the old, slow MIDI cable: only the first note-on in a
> group needs its status byte sent; after that, every pair of data bytes
> received until the next status byte can be assumed to be another note,
> with zero velocity meaning the same thing as note-off, without
> requiring a new status byte to be sent.)
>
> - Mike/AQ
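In code, a byte-stream parser handles this by remembering the last status byte and mapping velocity-0 note-ons to note-offs. This is a minimal sketch added for illustration, not from the thread: channel extraction and non-note messages are omitted, and the names are placeholders.

```c
#include <stdint.h>

typedef struct {
    uint8_t running_status; /* last status byte seen, 0 if none */
} MidiParser;

enum { EVT_NONE, EVT_NOTE_ON, EVT_NOTE_OFF };

/* Decode one note message. The status byte is optional: under running
   status, bare data-byte pairs reuse the previous status. */
int midi_decode_note(MidiParser *p, const uint8_t *bytes, int len,
                     uint8_t *note, uint8_t *velocity)
{
    int i = 0;
    if (len > 0 && (bytes[0] & 0x80)) {   /* explicit status byte */
        p->running_status = bytes[0];
        i = 1;
    }
    if (len - i < 2 || p->running_status == 0)
        return EVT_NONE;                  /* incomplete message */
    *note = bytes[i];
    *velocity = bytes[i + 1];
    uint8_t kind = p->running_status & 0xF0;
    if (kind == 0x80)
        return EVT_NOTE_OFF;
    if (kind == 0x90)
        /* Note-on with velocity 0 is, by convention, a note-off. */
        return (*velocity == 0) ? EVT_NOTE_OFF : EVT_NOTE_ON;
    return EVT_NONE;
}
```

So a ReadProc that only matches `0x80` status bytes will appear to "lose" note-offs from files that encode them as velocity-0 note-ons under running status.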
>
> On Tue, Mar 19, 2013 at 6:02 PM, Pablo Ansonia <email@hidden> wrote:
> Hi all,
>
> iOS: successfully loaded a MIDI file with MusicPlayer to an endPoint.
> The Note On messages are arriving fine in my MIDIReadProc and I am
> forwarding them on to the AudioUnit sampler (with Vibes AUPreset).
>
> Even though the sampler seems to snub the notes correctly, I'm not
> seeing any MIDI Note OFF messages/packets in the ReadProc (I have
> verified that they are in the file). Is there some "back channel" of
> communication?
>
> Any pointers would be greatly appreciated,
>
> Thanks,
> PA
>
>
> ------------------------------
>
> Message: 6
> Date: Wed, 20 Mar 2013 00:24:28 +0000
> From: Phil Montoya <email@hidden>
> To: "email@hidden" <email@hidden>
> Subject: Dolby encoded audio
> Message-ID: <3338DB9AC290E542BE5E81F1C9F5E05F3353F9B7@aja-ex2010-01>
> Content-Type: text/plain; charset=us-ascii
>
> I'm wondering if it is possible to support Dolby encoded audio in a core audio driver? I don't see Dolby as a choice for the sampleFormat in the IOAudioStreamFormat.
>
> -Phil
>
>
> ------------------------------
>
> Message: 7
> Date: Tue, 19 Mar 2013 17:29:33 -0700
> From: Jeff Moore <email@hidden>
> To: "email@hidden" <email@hidden>
> Subject: Re: Dolby encoded audio
> Message-ID: <email@hidden>
> Content-Type: text/plain; charset=us-ascii
>
> Dolby is the format that is called kIOAudioStreamSampleFormat1937AC3 in the header. The format is a bitstream that would be sent over a SPDIF cable.
>
> --
>
> Jeff Moore
> Core Audio
> Apple
>
>
>
> On Mar 19, 2013, at 5:24 PM, Phil Montoya <email@hidden> wrote:
>
> I'm wondering if it is possible to support Dolby encoded audio in a core audio driver? I don't see Dolby as a choice for the sampleFormat in the IOAudioStreamFormat.
>
>
>
>
> ------------------------------
>
> Message: 8
> Date: Wed, 20 Mar 2013 00:49:08 +0000
> From: Phil Montoya <email@hidden>
> To: Jeff Moore <email@hidden>
> Cc: "email@hidden" <email@hidden>
> Subject: Re: Dolby encoded audio
> Message-ID: <3338DB9AC290E542BE5E81F1C9F5E05F3353FAAF@aja-ex2010-01>
> Content-Type: text/plain; charset=us-ascii
>
> Very nice! So if we publish an IOAudioStream format with the Dolby sample format, what will we get in our clip call? Will it be Dolby encoded audio data? I ask this because PCM formats arrive as floats so we do have to do some conversion. I'm wondering what type of conversion (if any) is needed for Dolby?
>
> -Phil
>
>
> On Mar 19, 2013, at 5:29 PM, Jeff Moore <email@hidden> wrote:
>
> Dolby is the format that is called kIOAudioStreamSampleFormat1937AC3 in the header. The format is a bitstream that would be sent over a SPDIF cable.
>
> --
>
> Jeff Moore
> Core Audio
> Apple
>
>
>
> On Mar 19, 2013, at 5:24 PM, Phil Montoya <email@hidden> wrote:
>
> I'm wondering if it is possible to support Dolby encoded audio in a core audio driver? I don't see Dolby as a choice for the sampleFormat in the IOAudioStreamFormat.
>
>
>
>
>
>
> ------------------------------
>
> Message: 9
> Date: Tue, 19 Mar 2013 19:23:51 -0700
> From: Jeff Moore <email@hidden>
> To: "email@hidden" <email@hidden>
> Subject: Re: Dolby encoded audio
> Message-ID: <email@hidden>
> Content-Type: text/plain; charset=us-ascii
>
> Yes. The data that comes down to the driver will be AC-3. Note that because the format is not mixable, there are parts of the IOAudio family's data transfer pipe that get skipped, particularly the mixing parts.
>
> --
>
> Jeff Moore
> Core Audio
> Apple
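Because the stream is a non-mixable bitstream, a driver's clip routine would pass the AC-3 bytes through untouched instead of converting from float. The following is a hedged sketch: the real `IOAudioEngine::clipOutputSamples` has a different signature and more parameters; this simplified function only illustrates the two branches.

```c
#include <stdint.h>
#include <string.h>

enum { FORMAT_LINEAR_PCM, FORMAT_1937_AC3 };

/* Simplified stand-in for a clip routine: float PCM gets converted to
   16-bit samples, while the AC-3 bitstream is moved verbatim. */
void clip_samples(int sampleFormat, const void *src, void *dst,
                  uint32_t numSamples)
{
    if (sampleFormat == FORMAT_1937_AC3) {
        /* Encoded bitstream: no float conversion, just copy the bytes.
           Here numSamples counts 16-bit words of the IEC 1937 stream. */
        memcpy(dst, src, numSamples * sizeof(int16_t));
        return;
    }
    /* Linear PCM: host delivers float32, device wants int16. */
    const float *in = (const float *)src;
    int16_t *out = (int16_t *)dst;
    for (uint32_t i = 0; i < numSamples; i++) {
        float s = in[i];
        if (s > 1.0f) s = 1.0f;     /* clip to [-1, 1] */
        if (s < -1.0f) s = -1.0f;
        out[i] = (int16_t)(s * 32767.0f);
    }
}
```

The key point from the answer above: for the AC-3 format no sample conversion happens at all, since any arithmetic on the bitstream would corrupt it.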
>
>
>
> On Mar 19, 2013, at 5:49 PM, Phil Montoya <email@hidden> wrote:
>
> Very nice! So if we publish an IOAudioStream format with the Dolby sample format, what will we get in our clip call? Will it be Dolby encoded audio data? I ask this because PCM formats arrive as floats so we do have to do some conversion. I'm wondering what type of conversion (if any) is needed for Dolby?
>
> -Phil
>
>
> On Mar 19, 2013, at 5:29 PM, Jeff Moore <email@hidden> wrote:
>
> Dolby is the format that is called kIOAudioStreamSampleFormat1937AC3 in the header. The format is a bitstream that would be sent over a SPDIF cable.
>
> --
>
> Jeff Moore
> Core Audio
> Apple
>
>
>
> On Mar 19, 2013, at 5:24 PM, Phil Montoya <email@hidden> wrote:
>
> I'm wondering if it is possible to support Dolby encoded audio in a core audio driver? I don't see Dolby as a choice for the sampleFormat in the IOAudioStreamFormat.
>
>
>
>
>
>
> ------------------------------
>
> _______________________________________________
> Coreaudio-api mailing list
> email@hidden
> https://lists.apple.com/mailman/listinfo/coreaudio-api
>
> End of Coreaudio-api Digest, Vol 10, Issue 97
> *********************************************
>
>
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden