Swift and Core Audio
Like many of you, I’ve been knee-deep in Swift this week. Once you get beyond the hello-world stuff and try something a bit more complicated, you start to learn the language. So, why not go off the deep end and try to work with what is essentially a C library: Core Audio? Turns out that’s a good way to get the types grokked.
First, don’t try to use Core Audio in a Swift playground. I wasted a day trying to do this. It doesn’t work yet. So, create a project.
Update: In Swift 2 you can create an AUGraph in a playground. You just need this at the top of your playground.
import XCPlayground
XCPlaygroundPage.currentPage.needsIndefiniteExecution = true
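With that at the top, even the graph creation that tripped me up below works in a playground. A quick Swift 2 sketch:

import XCPlayground
import AudioToolbox

XCPlaygroundPage.currentPage.needsIndefiniteExecution = true

// AUGraph is an opaque pointer type; start with an empty value and let
// NewAUGraph fill it in.
var processingGraph = AUGraph()
NewAUGraph(&processingGraph)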
I couldn’t do something as simple as this in the playground:
var processingGraph: AUGraph = AUGraph()
So, OK, I put that in a Swift class as an instance variable and created it in the init function.
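Something like this, in other words (a bare-bones sketch; the class name is made up):

import AudioToolbox

class MySynth {
    // The graph lives as an instance variable so it sticks around after init.
    var processingGraph: AUGraph = AUGraph()

    init() {
        NewAUGraph(&processingGraph)
    }
}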
Most Core Audio functions return a status code, which is defined as OSStatus. You need to specify the type on the var.
var status: OSStatus = 0
status = NewAUGraph(&processingGraph)
Or, if you want, you can cast noErr like this.
var status = OSStatus(noErr)
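You’ll also see CheckError(status) in the snippets below. That’s not a framework call, just the usual little helper everyone writes for themselves; a minimal Swift 2 sketch that only prints could look like this:

// Do nothing on success, otherwise report the OSStatus value.
func CheckError(status: OSStatus) {
    if status == OSStatus(noErr) {
        return
    }
    print("Core Audio error: \(status)")
}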
Here’s an adventure with Boolean.
The function AUGraphIsInitialized is defined like this:
func AUGraphIsInitialized(inGraph: AUGraph, outIsInitialized: CMutablePointer<Boolean>) -> OSStatus
So, you call it like this:
var status: OSStatus = OSStatus(noErr)
var outIsInitialized: Boolean = 0
status = AUGraphIsInitialized(self.processingGraph, &outIsInitialized)
That works. But how do you check it?
Boolean is defined as a CUnsignedChar (in MacTypes.h).
So, you cannot do this:
if outIsInitialized {
    // whatever
}
And you cannot cast it (could not find an overload…)
var b: Bool = Bool(outIsInitialized)
or with Swift’s “as”
var b: Bool = outIsInitialized as Bool
I’m clearly overthinking this. Because this is all that is needed. D’oh!
var outIsInitialized: Boolean = 0
status = AUGraphIsInitialized(self.processingGraph, &outIsInitialized)
if outIsInitialized == 0 {
    status = AUGraphInitialize(self.processingGraph)
    CheckError(status)
}
Update: In Swift 2, you use DarwinBoolean instead.
var outIsInitialized = DarwinBoolean(false)
status = AUGraphIsInitialized(self.processingGraph, &outIsInitialized)
if !outIsInitialized {
    status = AUGraphInitialize(self.processingGraph)
    CheckError(status)
}
Another problem I had was using constants such as kAudioUnitSubType_Sampler while trying to create an AudioComponentDescription. The trick was to simply cast to OSType.
var cd = AudioComponentDescription(
    componentType: OSType(kAudioUnitType_MusicDevice),
    componentSubType: OSType(kAudioUnitSubType_Sampler),
    componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
    componentFlags: 0,
    componentFlagsMask: 0)
status = AUGraphAddNode(self.processingGraph, &cd, &samplerNode)
CheckError(status)
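The rest of the graph setup is the same dance: cast the constants where the imported types don’t line up, pass pointers with &, and check every status. For completeness, here’s a sketch of the remaining steps on iOS; the RemoteIO output and the 0-to-0 connection are just the usual choices, not anything specific to this post, and samplerNode is the node created above.

var ioNode = AUNode()
var ioDescription = AudioComponentDescription(
    componentType: OSType(kAudioUnitType_Output),
    componentSubType: OSType(kAudioUnitSubType_RemoteIO),
    componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
    componentFlags: 0,
    componentFlagsMask: 0)
status = AUGraphAddNode(self.processingGraph, &ioDescription, &ioNode)
CheckError(status)

// Open the graph, then connect the sampler's output 0 to the output node's input 0.
status = AUGraphOpen(self.processingGraph)
CheckError(status)
status = AUGraphConnectNodeInput(self.processingGraph, samplerNode, 0, ioNode, 0)
CheckError(status)

// Initialize and start.
status = AUGraphInitialize(self.processingGraph)
CheckError(status)
status = AUGraphStart(self.processingGraph)
CheckError(status)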
Thanks for this!
Please post any thoughts on how to write to an AVAudioPCMBuffer. I’m stuck.
Bob, send me the code for what you’re trying to do. Are you generating samples in code?
You’d use the floatChannelData variable.
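Roughly like this, assuming a standard deinterleaved float mono format; the frequency and frame count are just placeholders (Swift 2):

import AVFoundation

let sampleRate = 44100.0
let frequency = 440.0
let frameCount: AVAudioFrameCount = 4096

let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)
let buffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: frameCount)
buffer.frameLength = frameCount

// floatChannelData is a pointer to per-channel sample pointers; channel 0 is the only one here.
let samples = buffer.floatChannelData[0]
for frame in 0..<Int(frameCount) {
    samples[frame] = Float(sin(2.0 * M_PI * frequency * Double(frame) / sampleRate))
}

If you generate buffer after buffer, carry the phase over between them instead of starting at zero each time, or you’ll hear glitches at the seams.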
Hi Gene,
Yeah, I’m trying to generate samples in code. I can produce sound, but when I try to change the frequency variable I get a modulation effect. The other strange thing is that when I change the frame count, the pitch changes. I’m guessing that has something to do with the buffer being looped.
Thanks for any help.
Bob
Hi Gene, Many thanks for posting this.
I am translating an Objective-C application which – like you – I am using to kick Swift’s tyres.
I had reached the complicated bit of calling the Audio library, and was a bit daunted. But your article has pointed the way!
Cheers,
Philip
Most people would write another Weather or To Do app to learn a new language.
Not us.
We do the hardest thing on the platform. 🙂
Glad to be of help
Gene, thanks for the article. I think you’re representing a lot of people, like myself and others above, who take Core Audio seriously. I especially take it seriously because I’m using iOS to improve my hearing and also to “try” and mask my tinnitus (I’ve tried so many hearing aids; the industry is knee-deep in earning huge gross margins – manufacturers and audiologists – not in quality).
Swift and the Swift playground/REPL are much more conducive to researching new ideas, much as OO environments like Smalltalk and Actor once were. Please add me to your mailing list if you continue to get deeper into the subject. I’m looking into trying to bridge STK (Synthesis Toolkit).
If you find any other resources or books to be published, I’d love to know. Thx.
Thanks for this. A lot. I even thought I’d use that gittip button but it didn’t work… 🙂
How about Core Audio’s AudioUnitSetProperty function? I cannot seem to convert this to Swift.
Here is an example:
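Generally it’s the same recipe as above: pull the AudioUnit out of the node, cast the constants, and pass the value with &. A sketch, assuming the processingGraph and samplerNode from the post; MaximumFramesPerSlice is just a stand-in for whatever property you actually need:

var status = OSStatus(noErr)

// Get the AudioUnit for the sampler node first.
var samplerUnit = AudioUnit()
status = AUGraphNodeInfo(processingGraph, samplerNode, nil, &samplerUnit)
CheckError(status)

// Set a simple UInt32-valued property; swap in your own property, scope, and value.
var maxFrames: UInt32 = 4096
status = AudioUnitSetProperty(samplerUnit,
    AudioUnitPropertyID(kAudioUnitProperty_MaximumFramesPerSlice),
    AudioUnitScope(kAudioUnitScope_Global),
    0,
    &maxFrames,
    UInt32(sizeof(UInt32)))
CheckError(status)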
Hello Gene De Lisa!
Do you know if it is possible to play an AVAudioPlayer’s MPMediaItem (Apple Music content) with an equalizer?