Swift: AUGraph and MusicSequence
The AudioToolbox MusicSequence remains the only way to create a MIDI sequence programmatically. The AVFoundation class AVMIDIPlayer will play a MIDI file, but not a MusicSequence.
AVAudioEngine has a musicSequence property, but it doesn’t seem to do anything yet (except crash when you set it). So the way to get a MusicSequence to play with instrument sounds is to create a low-level Core Audio AUGraph and play the sequence with a MusicPlayer.
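(For the file-playing case, here is a minimal AVMIDIPlayer sketch; the file name “song.mid” is hypothetical, and the SoundFont is the one used later in this post.)

import AVFoundation

var error: NSError?
if let midiURL = NSBundle.mainBundle().URLForResource("song", withExtension: "mid") {
    let bankURL = NSBundle.mainBundle().URLForResource("GeneralUser GS MuseScore v1.442", withExtension: "sf2")
    if let player = AVMIDIPlayer(contentsOfURL: midiURL, soundBankURL: bankURL, error: &error) {
        player.prepareToPlay()
        player.play(nil) // optional completion handler
    }
}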
Create a MusicSequence
MusicPlayer create
Playing a MusicSequence
Create an AUGraph
Create sampler
Create IO node
Obtain Audio Units
Wiring
Starting the AUGraph
Soundfont
Summary
Resources
Introduction
Apple is moving towards a higher-level audio API with AVFoundation. The AVAudioEngine looks promising, but it is incomplete. Right now there isn’t a way to associate an AudioToolbox MusicSequence with it. So, here I’ll use a low-level Core Audio AUGraph for the sounds.
Create a MusicSequence
Let’s start by creating a MusicSequence with a MusicTrack that contains several MIDINoteMessages.
var musicSequence: MusicSequence = MusicSequence()
var status = NewMusicSequence(&musicSequence)
if status != OSStatus(noErr) {
    println("\(__LINE__) bad status \(status) creating sequence")
}

// add a track
var track = MusicTrack()
status = MusicSequenceNewTrack(musicSequence, &track)
if status != OSStatus(noErr) {
    println("error creating track \(status)")
}

// now make some notes and put them on the track
var beat = MusicTimeStamp(1.0)
for i: UInt8 in 60...72 {
    var mess = MIDINoteMessage(channel: 0,
        note: i,
        velocity: 64,
        releaseVelocity: 0,
        duration: 1.0)
    status = MusicTrackNewMIDINoteEvent(track, beat, &mess)
    if status != OSStatus(noErr) {
        println("error creating midi note event \(status)")
    }
    beat++
}
MusicPlayer create
Now you need a MusicPlayer to hear it. Let’s make one and give it our MusicSequence.
Here, I “pre-roll” the player for fast startup when you hit a play button. You don’t have to do this, but here is the way to do it.
var musicPlayer = MusicPlayer()
var status = NewMusicPlayer(&musicPlayer)
if status != OSStatus(noErr) {
    println("bad status \(status) creating player")
}

status = MusicPlayerSetSequence(musicPlayer, musicSequence)
if status != OSStatus(noErr) {
    println("setting sequence \(status)")
}

status = MusicPlayerPreroll(musicPlayer)
if status != OSStatus(noErr) {
    println("prerolling player \(status)")
}
Playing a MusicSequence
Finally, you tell the player to play like this, probably from an IBAction.
status = MusicPlayerStart(musicPlayer)
if status != OSStatus(noErr) {
    println("Error starting \(status)")
    return
}
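If you want a play button that toggles, here is a minimal sketch; the action name playPressed is mine, and MusicPlayerIsPlaying and MusicPlayerStop are the relevant AudioToolbox calls.

@IBAction func playPressed(sender: AnyObject) {
    // ask the player whether it is currently running
    var playing: Boolean = 0
    var status = MusicPlayerIsPlaying(musicPlayer, &playing)
    if playing != 0 {
        status = MusicPlayerStop(musicPlayer)
    } else {
        status = MusicPlayerStart(musicPlayer)
    }
    if status != OSStatus(noErr) {
        println("error toggling playback \(status)")
    }
}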
Wonderful sine waves! What if you want to hear something that approximates actual instruments?
Well, you can load SoundFont or DLS banks, or even individual sound files. Here, I’ll load a SoundFont.
Load it into what? Well, here I’ll load it into a Core Audio sampler, an AudioUnit. That means I’ll need to create a Core Audio AUGraph.
The end of the story is this: you associate the AUGraph with the MusicSequence like this.
MusicSequenceSetAUGraph(musicSequence, self.processingGraph)
Create an AUGraph
Great. So how do you make an AUGraph? If you want a bit more detail, look at my blog post on it using Objective-C. Here, I’ll just outline the steps.
Create the AUGraph with NewAUGraph. It is useful to define it as an instance variable.
// instance variable
var processingGraph: AUGraph = AUGraph()

var status = NewAUGraph(&self.processingGraph)
Create sampler
To create the sampler and add it to the graph, you need to create an AudioComponentDescription.
var samplerNode: AUNode = AUNode()
var cd = AudioComponentDescription(
    componentType: OSType(kAudioUnitType_MusicDevice),
    componentSubType: OSType(kAudioUnitSubType_Sampler),
    componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
    componentFlags: 0,
    componentFlagsMask: 0)
status = AUGraphAddNode(self.processingGraph, &cd, &samplerNode)
Create IO node
Create an output node in the same manner.
var ioNode: AUNode = AUNode()
var ioUnitDescription = AudioComponentDescription(
    componentType: OSType(kAudioUnitType_Output),
    componentSubType: OSType(kAudioUnitSubType_RemoteIO),
    componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
    componentFlags: 0,
    componentFlagsMask: 0)
status = AUGraphAddNode(self.processingGraph, &ioUnitDescription, &ioNode)
Obtain Audio Units
Now to wire the nodes together and init the AudioUnits. The graph needs to be open, so we do that first.
Then I obtain references to the audio units with the function AUGraphNodeInfo.
var samplerUnit: AudioUnit = AudioUnit()
var ioUnit: AudioUnit = AudioUnit()

status = AUGraphOpen(self.processingGraph)
status = AUGraphNodeInfo(self.processingGraph, self.samplerNode, nil, &samplerUnit)
status = AUGraphNodeInfo(self.processingGraph, self.ioNode, nil, &ioUnit)
Wiring
Now wire them using AUGraphConnectNodeInput.
var ioUnitOutputElement: AudioUnitElement = 0
var samplerOutputElement: AudioUnitElement = 0
status = AUGraphConnectNodeInput(self.processingGraph,
    self.samplerNode, samplerOutputElement, // srcnode, inSourceOutputNumber
    self.ioNode, ioUnitOutputElement)       // destnode, inDestInputNumber
Starting the AUGraph
Now you can initialize and start the graph.
var status: OSStatus = OSStatus(noErr)
var outIsInitialized: Boolean = 0
status = AUGraphIsInitialized(self.processingGraph, &outIsInitialized)
if outIsInitialized == 0 {
    status = AUGraphInitialize(self.processingGraph)
}

var isRunning: Boolean = 0
AUGraphIsRunning(self.processingGraph, &isRunning)
if isRunning == 0 {
    status = AUGraphStart(self.processingGraph)
}
Soundfont
Go ahead and play your MusicSequence now. Crap. Sine waves again. Well yeah, we didn’t load any sounds!
Let’s create a function to load a SoundFont, then use a “preset” from that font on the sampler unit. You need to fill out an AUSamplerInstrumentData struct. One thing that may trip you up is the fileURL parameter, which is an Unmanaged<CFURL>. Well, NSURL is automatically toll-free bridged to CFURL. Cool. But it is not Unmanaged, which is what is required. So, here I’m using Unmanaged.passUnretained. If you know a better way, please let me know.
Then we need to set the kAUSamplerProperty_LoadInstrument property on our samplerUnit. You do that with AudioUnitSetProperty. The preset numbers are General MIDI patch numbers. In the GitHub repo, I created a Dictionary of patches for ease of use, and an example Picker.
func loadSF2Preset(preset: UInt8) {
    if let bankURL = NSBundle.mainBundle().URLForResource("GeneralUser GS MuseScore v1.442", withExtension: "sf2") {
        var instdata = AUSamplerInstrumentData(fileURL: Unmanaged.passUnretained(bankURL),
            instrumentType: UInt8(kInstrumentType_DLSPreset),
            bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
            bankLSB: UInt8(kAUSampler_DefaultBankLSB),
            presetID: preset)

        var status = AudioUnitSetProperty(
            self.samplerUnit,
            UInt32(kAUSamplerProperty_LoadInstrument),
            UInt32(kAudioUnitScope_Global),
            0,
            &instdata,
            UInt32(sizeof(AUSamplerInstrumentData)))
        CheckError(status)
    }
}
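The Dictionary in the repo is more complete, but the idea is simply to map General MIDI patch names to program numbers; a minimal sketch:

// A few General MIDI patch numbers by name (the repo covers all 128).
let gmPatches: [String: UInt8] = [
    "Acoustic Grand Piano": 0,
    "Violin": 40,
    "Trumpet": 56,
    "Flute": 73
]
if let preset = gmPatches["Violin"] {
    loadSF2Preset(preset)
}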
Summary
You can create a Core Audio AUGraph, attach it to a MusicSequence, and play it.
Resources
My post on Using AUGraph in iOS
Audio Unit Hosting Guide for iOS
Audio Unit Processing Graph Services Reference
Audio Unit Properties Reference
Audio Unit Parameters Reference
Audio Unit Component Services Reference
Music in iOS and Lion WWDC 2011 presentation slides (sign in required)
Learning Core Audio: A Hands-On Guide to Audio Programming for Mac and iOS
Hi,
Great post. The only problem I have: the code plays a sine wave.
The project works fine as a standalone; however, when I copy it into my project, I only hear a sine wave.
It appears the AUGraph is not being used.
Could there be something I haven’t copied over from the original files?
Thank you
Z
Did you install a SoundFont?
I fixed the problem; the SoundFont was not added to the target. I just needed to tick a box.
Thank you
Z
Me Again.
Hi Gene De Lisa,
I have a MIDI music sequence created programmatically. How would I go about saving or exporting the MIDI file to an iDevice? For example, as a ringtone, preferably with the particular instrument I have selected.
Any hints or advice is of course very much appreciated.
Z
I’m finding the following is still the case:
AVAudioEngine has a musicSequence property. It doesn’t seem to do anything yet (except crash when you set it)
Have you found otherwise yet?
Thank you very much for your work on converting this API to Swift. It has saved me many hours of work.
Thank you for the excellent and very helpful guide. I really appreciate it.
I have a question.
I’m using an audio file for the instrument and wanted to know if there is a way to change the start time of that instrument. I can currently change the end time by modifying the duration of the midi note message, but I can’t find a way to start playing the audio file, say, 5 seconds from beginning of the file.
I understand also that every event with this instrument will have this same start time (unless it’s possible to do it on an event basis).
I found another of your guides about trimming a sound file, which could also be an option, but that seems like a roundabout way.
Appreciate any advice
Thanks again!
Might be more trouble than it’s worth (but that’s up to you to judge) but play around with aupresets via AU Lab.
Here is some info.
Let me know if that works for you.
Thank you so much for the quick reply.
I checked out aupresets and it appears it’s really only useful for assigning audio files to ranges of pitches for one instrument. So nothing there to change the start time. I consider that the answer is to use the trim guide. Which I may have questions for you on lol.
Thanks again!
Oh gee, sorry. Worth a shot. Thanks for checking it out.
I guess you’ll just have to fire up Audacity and trim them.
Just wanted to come back and add my current solution for this.
I really appreciate your guides because the documentation on this stuff is not very helpful at times.
So I ended up using a mix of AUGraph/MusicPlayer and AVAudioEngine.
In AVAudioEngine you can hook up an AVAudioPlayerNode which has a method:
scheduleSegment:startingFrame:frameCount:atTime:completionHandler:
Here you can provide a starting frame.
But as your guide points out, I can’t play a sequence with AVAudioEngine, so I still use AudioToolbox to play the sequence. When the MIDI notes want to play, though, if I’m using a single audio file, I use AVAudioEngine instead to play said file.
This way the user can adjust on the fly without having to export an entire new file as well as use the single file multiple times with different settings (start times for example).
Hope that makes sense and/or helps someone. =)
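A minimal sketch of that approach, assuming an audio file named "loop.caf" in the bundle (names and times are illustrative):

// Start playback of an audio file 5 seconds in with AVAudioPlayerNode.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attachNode(player)

var error: NSError?
let url = NSBundle.mainBundle().URLForResource("loop", withExtension: "caf")!
let file = AVAudioFile(forReading: url, error: &error)
engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)
engine.startAndReturnError(&error)

// schedule the segment from 5 seconds to the end of the file
let startFrame = AVAudioFramePosition(5.0 * file.processingFormat.sampleRate)
let frameCount = AVAudioFrameCount(file.length - startFrame)
player.scheduleSegment(file, startingFrame: startFrame, frameCount: frameCount,
    atTime: nil, completionHandler: nil)
player.play()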
I’m not sure if I am the only one experiencing this, but MusicSequences that loop fine in iOS 8 seem to stop looping on iOS 9 (I have several MusicSequence projects, but I’ve also tried out this sample code). Have you noticed this also?
The Github project for this post has a slider that controls the loop – even though I didn’t talk about looping here. I just tried it and it seems to work.
If you have your code in a public repo someplace send me the link and I’ll take a look.
Hey Gene,
I have a similar issue as Rosano. As of iOS 9 my MIDI app stopped working. In iOS 7 and 8 it worked great. I spent all day today trying to figure out what’s going on. At this point, I’m convinced that it’s an iOS 9 bug. And since you said you don’t experience it, I’m wondering if you could run this code and see if you are experiencing it too:
https://gist.github.com/Nikolozi/d001c11533e689587809
Basically, if I run this code as is, iOS will start playing it using its internal synth (since the MIDI endpoint is not set); then I hear a few notes playing with crackling sounds and then it stops. It doesn’t play all the notes or loop (which it should). If I add the endpoint code back in and run it while a synth app is in background audio mode, again it will try to play a few notes and then the synth gets stuck holding down multiple notes at once (quite different from the programmed sequence).
I remember very similar bug in 64-bit iPhones back in 2013, but it was resolved soon after: http://lists.apple.com/archives/coreaudio-api/2013/Oct/msg00012.html
I’m worried that this bug is back. The issue is happening in iOS 9.1 beta 2 also.
If you have a chance to try it out that be great.
Cheers
Niko
In iOS 9 there are still MusicPlayer bugs. I wish Radar was public to track the progress of bug fixing.
http://lists.apple.com/archives/coreaudio-api/2015/Aug/msg00030.html
It plays with my AUGraph and the loop seems to work.
But as you see it doesn’t work in other cases.
I ran it on my 4s. Do you have a more recent device?
Does my project have the same problem on your device?
Oh man. That’s exactly the bug. I’m on the CA mailing list as well; I wonder how I missed that.
Sadly, the bug doesn’t seem to be fixed on iOS 9.0.1 or 9.2b2.
To answer your question: I ran your code in 3 different modes. 1) As is, 2) with no endpoint (i.e. sine wave), 3) Core MIDI sending messages to a background synth app. In all 3 cases it doesn’t loop and usually gets stuck on one of the notes.
I’ve tested it on my iPhone 6, iPad Air 2 and 5th gen iPod Touch. All have the same issue.
It’s interesting that you are not seeing the issue on 4s. I wonder what the difference is between that and iPod Touch 5th gen. They use the same CPU.
I’ll try to post a reply to that CA mailing list post to see if Apple will respond with an update on the issue. As much as I’m up for a challenge of implementing the MIDI sequence player myself I don’t have much time atm.
And thanks for your prompt response.
Just updated to 9.0.1 and tried it again. I get only 2 notes played. Then a freeze.
The good news is that is the same thing you’re getting. The bad thing is that it confirms the bug.
FWIW, the 8.4 simulator works.
The 9.0 simulator works a bit better than the devices, but freezes when you set the loop point.
I’ll ask Doug what the deal is.
Okay, I filed a bug report and sent them a video and source code (before I saw your archive link). I put it on Dropbox if you still want to check it out. My issue happened on an iPhone 5s with 9.0.1, as well as on all simulators running iOS 9.
Hopefully this will be gone in 9.1!
The looping bug is still present in the just-released iOS 9.1.
Hi Gene,
Thank you so much for the comprehensive tutorial! I’ve successfully created a sequence using my own array of notes/velocities, etc. via MIDINoteMessage. I was wondering how I can embed some controller messages (like CC1 Modulation, CC7 Volume) into the sequence? Is there any structure in AudioToolbox that can let me do it easily?
Thanks again.
Rex
I think I sort of found the answer. What I’m doing now is using MIDIChannelMessage to write the controller data into the MusicTrack. Here’s the code I’ve written to write in a CC1 controller message.
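A minimal sketch of such a message, assuming the track and beat variables from the post:

// CC 1 (modulation) on channel 0 at the given beat; the value 127 is illustrative.
var mod = MIDIChannelMessage(status: 0xB0, data1: 1, data2: 127, reserved: 0)
status = MusicTrackNewMIDIChannelEvent(track, beat, &mod)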
My problem seems solved, but is it the best way to do it?
Thanks!
You beat me to it by a few seconds 🙂
Yes that’s it.
Yes, look at the docs for MusicTrack. There is a function named MusicTrackNewMIDIChannelEvent.
How do you disable the default synth? I am using the MidiOutput callback to send the note data somewhere else, but I can’t seem to disable the default synth sound that is being generated by the MusicPlayer.
The line that connects the MusicSequence to the AUGraph is this one:
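MusicSequenceSetAUGraph(musicSequence, self.processingGraph)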
When I do a NoteOn to channel 9 (MIDI Channel 10), I’m expecting a drum sound. However, a melodic sound plays. What can I do to make drum notes play?
Did you send MIDIChannelMessages (bank msb and lsb) to select the bank, then a program change (0xC0 + channel) to select the patch?
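Something like this sketch, added to the track before the notes; the bank constants are the standard AUSampler defaults, and the timestamps and program number are illustrative:

// Bank select (CC 0 / CC 32) for the percussion bank on MIDI channel 10
// (status nibble 9), then a program change to pick the kit.
var bankMSB = MIDIChannelMessage(status: 0xB9, data1: 0,
    data2: UInt8(kAUSampler_DefaultPercussionBankMSB), reserved: 0)
var bankLSB = MIDIChannelMessage(status: 0xB9, data1: 32,
    data2: UInt8(kAUSampler_DefaultBankLSB), reserved: 0)
var programChange = MIDIChannelMessage(status: 0xC9, data1: 0, data2: 0, reserved: 0)
status = MusicTrackNewMIDIChannelEvent(track, 0, &bankMSB)
status = MusicTrackNewMIDIChannelEvent(track, 0, &bankLSB)
status = MusicTrackNewMIDIChannelEvent(track, 0, &programChange)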
Hi, thanks for the tutorial; it has been very useful given the incomplete and messy Apple documentation.
I successfully create sequences and use the sampler to put an instrument over the MIDI, but all channels sound with the same instrument. I do not know if it is possible to assign a different sound to each channel in the MIDI.
Thanks in advance