Noob Questions


  • Subject: Noob Questions
  • From: Matt Mashyna <email@hidden>
  • Date: Tue, 8 Jan 2008 13:51:42 -0500

Sorry for the complete noob questions. I spent the weekend trying to find a starting place with Core Audio but I didn't find what I was looking for. I'm hoping some more experienced developers can help me find a place to start.

I've been a software developer for a long time, since before there were Macs, but I put music on the back burner for 25+ years. I'm so out of it that my sight reading is complete crap now, and I wanted to find an application that would let me connect my bass, or another instrument with a digital interface, to my Mac, play notes, and get feedback from the application: a "did I hit the right note at the right time?" kind of thing. I looked around but haven't found just what I'm looking for. Since I'm pretty good at developing software, I figure I can do it myself.

So, what I *think* I want to do is use Core Audio to get input from an instrument, use pitch detection to figure out whether the note is close enough to the note that should be played, and tell the user whether it was correct. I looked at the Core Audio examples and they show how to process audio, as Audio Units, through a pipeline architecture. What I'm having a hard time understanding is how my application fits the pipeline model. I don't want to output audio; I want to get an audio stream, analyze it, and give the user feedback about it. It doesn't seem to really fit in as an Audio Unit, but maybe I just don't get it.

Can anyone offer advice about how I should approach what I want to do using Core Audio? Do I want to build it as an Audio Unit or use some other paradigm? I was thinking that if it were an AU, a Cocoa host app could query the AU for what note was currently being played. Does that make sense?


Thanks, Matt
  • Follow-Ups:
    • Re: Noob Questions
      • From: Jens Alfke <email@hidden>