Sunday, February 14, 2016

How do I set up sound effects in iOS (Swift 2.1)? (Or, man! What is it with this digital sound stuff?!)

So, a while back, I took a digital music programming class for a language called ChucK. OMG! I found out that I know NOTHING about music! (5th - 8th grade clarinet in band, I thought I knew a little - but I was wrong! ;)
Now comes an opportunity to code some sound effects in iOS - and I'm still struggling with the concepts of digital/electronic music. What's a “mixer”? Do I need one? What is an audio node, an audio unit? What exactly is “reverb” and “delay”? Which pieces do I need and how do I put them together to make an “echo” or change the pitch? Well, I still don't know exactly, but I can tell you enough to get some interesting sound effects going in your Swift 2.1 iOS app. Read on...
In ChucK, you “ChucK” stuff to the DAC to make sound come out. Really? It looks something like this:
Gain masterGainLeft => Pan2 panLeft => dac.left;
Gain masterGainRight => Pan2 panRight => dac.right;
Gain masterGainCenter => Pan2 panCenter => dac;
The => is the “ChucK” symbol. In this example, I'm taking variables of type Gain named “masterGain”-something, “ChucKing” them to Pan2 objects named “pan”-something, which are each then getting “ChucKed” to one of the DAC's left, right, and center nodes.
If you want to know what this stuff means, see the footnotes at the bottom. For now, just know that they're all particular facets of an audio “sound chain”.
I was able to understand enough of the concepts to make music to complete the ChucK course. But, now I needed to do it again in Swift…
As I tried to “gain” (sorry! ;) a better understanding of how the pieces fit together, this video helped:
AVAudio in Practice - WWDC 2014, Session 502
In short, my Swift “sound chain” needed something like this ChucK statement:
input => effect => DAC

To do this in Swift at a most basic level, I followed these steps:
1. Create an AVAudioEngine.
2. Create an AVAudioPlayerNode, and attach it to the audio engine.
3. Set up one or more effects. (Some example effects are: AVAudioUnitTimePitch, AVAudioUnitDelay, and AVAudioUnitReverb.) Set any values associated with the effects (such as pitch or “wet/dry mix”), then attach the effect(s) to the audio engine.
4. Connect the pieces of the sound chain together using the audio engine's connect function, starting with the AVAudioPlayerNode and ending with the AVAudioEngine.outputNode (representing the DAC in iOS).
5. Using the audio player node's scheduleFile function, set the chain up to play.
6. Start the engine.
7. Using the audio player node, play the sound.
Here's what that might look like in code for a pitch change:
/**
    Plays audio at specified pitch.

    - Parameter pitchVal: Pitch at which to play audio.
*/
func playAudioAtPitch(pitchVal:Float) {
    // the audio engine is a global variable,
    //  instantiated in viewDidLoad
    // my resetAudio function stops and resets
    //   the engine before we set up our sound chain
    resetAudio()

    // set up audio player node, and attach it to the engine
    let audioPlayerNode = AVAudioPlayerNode()
    audioEngine.attachNode(audioPlayerNode)

    // set up pitch effect node, and attach it to the engine
    let pitchEffect = AVAudioUnitTimePitch()
    pitchEffect.pitch = pitchVal
    audioEngine.attachNode(pitchEffect)

    // connect the nodes to each other through
    //  the audio engine to make the sound chain:
    // AVAudioPlayerNode => AVAudioUnitTimePitch
    // AVAudioUnitTimePitch => AVAudioEngine's output node
    audioEngine.connect(audioPlayerNode, to: pitchEffect, format: nil)
    audioEngine.connect(pitchEffect, to: audioEngine.outputNode, format: nil)

    // schedule the recording to play
    //  audioFile is a global AVAudioFile variable,
    //    instantiated in viewDidLoad with a sound file
    audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: nil)

    // start the engine
    try! audioEngine.start()

    // play the audio
    audioPlayerNode.play()

}
Easy peasy, eh? It plays the sound file with the pitch effect. Here's more detail on how this works...
You’ve got a sound file you want to play with effects. You set up an audio player node to be the “input”, representing your sound file. That input will be put through whatever effects you set up. Then, the whole thing will be put out through the DAC, er, audio engine’s output node. (BTW, the output node points to the default sound output for your device.)
I assume you get pitch. What about “reverb” and “delay”? What are those? You can see a nice GIF and description here that can give you a head start:
Reflection: Echo vs Reverberation
Basically, reverberation is like a much smaller echo. Reverberation is what happens in a room (or singing in your shower ;) while echo is what happens when you yell around a bunch of rock walls, such as in a canyon. This is where “delay” comes in. Take a reverb, add a delay - and you’ve got an echo.
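Conceptually, a delay effect just mixes a time-shifted, attenuated copy of the signal back into the original - that's the heart of an echo. Here's a toy sketch of that idea on raw samples (written in current Swift syntax for clarity; this is just to illustrate what “delay” does, not AVFoundation code):

```swift
// Toy echo: mix a delayed, attenuated copy of the signal into the original.
// delaySamples: how far back the copy is shifted; decay: how much quieter it is.
func addEcho(to samples: [Float], delaySamples: Int, decay: Float) -> [Float] {
    // extend the output so the delayed copy has room at the end
    var output = samples + [Float](repeating: 0, count: delaySamples)
    for i in 0..<samples.count {
        output[i + delaySamples] += samples[i] * decay
    }
    return output
}
```

A single impulse like [1, 0, 0, 0], with a 2-sample delay and 0.5 decay, comes out as [1, 0, 0.5, 0, 0, 0] - the original sound, then its quieter copy.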
So, we can do this with the code above. We’ve got a pitch node, so let’s add delay and reverb nodes, using the same pattern we used for the pitch effect:
    // set up delay node for echo
    let delayUnit = AVAudioUnitDelay()
    delayUnit.wetDryMix = echoWetDryMix
    audioEngine.attachNode(delayUnit)

    // set up reverb effect node
    let reverbEffect = AVAudioUnitReverb()
    reverbEffect.loadFactoryPreset(.Cathedral)
    reverbEffect.wetDryMix = reverbWetDryMix
    audioEngine.attachNode(reverbEffect)
Er… what is this wetDryMix property? Numerous explanations exist on the web; maybe you’ll be able to find one that makes sense to you (unless you work with MIDI equipment or the like and totally get it already!). The value is a Float representing the blend between the original (“dry”) sound and the effect’ed (“wet”) sound, expressed as a percentage: 0.0 gives you no effect at all, while 100.0 gives you nothing but the effect. (Apple’s documentation gives the valid range as 0 through 100.)
I set my echoWetDryMix to 0.0 and my reverbWetDryMix to 25.0 to get a lovely cathedral sound. This provided all reverb, no echo. Alternatively, I set my echoWetDryMix to 10.0 and my reverbWetDryMix to 0.0 to get an awesome echo, with no reverb. Experiment with your values to see what interesting things happen!
Now that we’ve added the delay and reverb effects, we need to adjust our sound chain to include them. It should look something like this:

input => pitch => delay => reverb => output
So, in Swift, change your connections section to match your sound chain:
    // connect nodes to each other through audio engine
    audioEngine.connect(audioPlayerNode, to: pitchEffect, format: nil)
    audioEngine.connect(pitchEffect, to: delayUnit, format: nil)
    audioEngine.connect(delayUnit, to: reverbEffect, format: nil)
    audioEngine.connect(reverbEffect, to: audioEngine.outputNode, format: nil)

Everything else stays the same.
What about that “mixer” thing I mentioned at the beginning? A “mixer” (not related to the “wet/dry mix”) comes into play when you want to combine sounds in special ways - from multiple sources, say. Like the outputNode property, the audio engine has a default mainMixerNode to help with this. The WWDC video can give you great info on it if you’re ready to go to that level. But, it works the same way: the mixer just gets added to the sound chain, along with the effects, as appropriate based on how you’re using it. I don't need a mixer, so I'll leave it at that for now.
Feel free to play with this stuff. Changing the order of the chain can sometimes affect the effect. Changing other values within an effect (such as delayTime within the AVAudioUnitDelay object) can create interesting sounds.
Check out the free documentation on ChucK => Strongly-timed, On-the-fly Music Programming Language, or the book Programming for Musicians and Digital Artists - even if you’re not interested in learning the ChucK language itself, you’ll likely find the discussions on the workings of digital music helpful in your sound programming efforts.
This has been a very basic level discussion of how to set up some sound effects in iOS using Swift. I know very little about digital music, but I hope I have given you a great start to expanding your own knowledge way past mine! Enjoy!
Footnotes:
ChucK => Strongly-timed, On-the-fly Music Programming Language - free, open source; includes Cocoa and iOS source code that makes up the underpinnings of ChucK. Note: some of the developers of ChucK have a company called Smule - makers of several popular music and sound-related apps on the App Store. See Dr. Ge’s TED talk linked on the ChucK site.
Programming for Musicians and Digital Artists - text for learning to program in ChucK, including discussions on the workings of digital music. (Even if you don’t care to learn ChucK, the digital music explanations can be very helpful!)
DAC:
Digital-to-Analog Converter - it represents the speakers, headset, or whatever outputs the sound on your computer.
Gain:
I'm probably not going to explain this right, but I'll try based on my understanding: gain is to audio output as bandwidth is to an Internet connection. On the Internet connection, if multiple people are downloading files, one might take up the whole bandwidth while the others have to wait their turn. Alternatively, the network might be set so all of them can download at the same time, but each can only use a fraction of the bandwidth.

Gain, then, is the amount of audio pipeline the sound is allowed to take up. In my ChucK example, I'm splitting my sound output up between 3 panning objects, so each has 1/3 of the sound “bandwidth”. (Note that volume is separate from gain: you could have something at full volume, yet it'll still sound quieter if it only has a fraction of the gain. This is similar to how you can still do a complete download with only part of your Internet bandwidth - the download will just be slower.)
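For what it's worth, in plain signal-processing terms gain is usually described as a multiplier applied to each sample's amplitude: a gain of 0.5 halves the signal, 1.0 passes it through unchanged. A toy sketch of that view (current Swift syntax, not ChucK):

```swift
// Gain as an amplitude multiplier: scale every sample by the gain factor.
func applyGain(_ samples: [Float], gain: Float) -> [Float] {
    return samples.map { $0 * gain }
}
```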
Pan:
Panning only works in stereo. It controls where you hear the sound: left speaker, right speaker, or both, as well as combinations in between. As an example, I really like songs or videos that have a car racing by - you hear the car come in from the left, say, then it seems to pass in front of you, then it leaves to the right. This is an awesome use of panning! Listen to this Car Passing By Sound Effect video.
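A toy sketch of panning: position -1 sends everything to the left channel, +1 everything to the right, and 0 splits it evenly. (Real panners, including ChucK's Pan2, typically use a constant-power curve rather than this simple linear law - this is just to show the idea.)

```swift
// Toy linear pan: split one sample into left/right amplitudes.
// position ranges from -1 (full left) to +1 (full right).
func pan(sample: Float, position: Float) -> (left: Float, right: Float) {
    let p = max(-1, min(1, position))   // clamp to the valid range
    return (left: sample * (1 - p) / 2, right: sample * (1 + p) / 2)
}
```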


Monday, February 8, 2016

Removing sound files from iPhone simulator during testing (Swift 2.1)

I was playing around with a recording app. During my tests, though, I realized the saved files could end up taking a bunch of space. So, I went looking for a way to handle this issue. I found two ways.

1) I could use a single, static filename and leave it in place. Then, each recording would overwrite any previous recording. Of course, this would leave the last file hanging out there when the app was done running.
2) I could somehow delete the file(s) when the app terminated.
I decided to go with the second option as a learning experience. In case you feel this is TL;DR, here’s my end code. I called the code from the App Delegate’s applicationWillTerminate function. Further code explanation will follow this snippet.
func removeWavFiles() {
    // get directories our app is allowed to use
    let pathURLs = NSFileManager.defaultManager().URLsForDirectory(NSSearchPathDirectory.DocumentDirectory, inDomains: .UserDomainMask)

    // if no directories were found, exit the function
    if pathURLs.count == 0 {
        return
    }

    // we found directories, so grab the first one
    let dirURL = pathURLs[0]

    // try to retrieve the contents of the directory
    // if the result is nil, we’ll exit; otherwise we can continue
    //    without force-unwrapping our variable (thanks to guard)
    guard let fileArray = try? NSFileManager.defaultManager().contentsOfDirectoryAtURL(dirURL, includingPropertiesForKeys: nil, options: .SkipsHiddenFiles) else {
        return
    }

    // create a predicate to match our sound file’s extension
    let predicate = NSPredicate(format: "SELF MATCHES[c] %@", "wav")

    // use our file list array to locate file paths with the right extension
    let wavOnlyFileArray = fileArray.filter { predicate.evaluateWithObject($0.pathExtension) }

    // iterate through each path in our new wav-only file list
    for aWavURL in wavOnlyFileArray {
        do {
            // try removing the current item; if we get an error, catch it
            try NSFileManager.defaultManager().removeItemAtURL(aWavURL)
        }
        catch {
            // we caught an error, so do something about it
            print("Unable to remove file: \(aWavURL)")
        }
    }
} // end removeWavFiles()
To start, I had to find where the saved file was located. I had saved the file using this code:
let dirPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as String
To find the OS X directory this pointed to from the Simulator, I ran this code:
let pathURLs = NSFileManager.defaultManager().URLsForDirectory(NSSearchPathDirectory.DocumentDirectory, inDomains: .UserDomainMask)

if pathURLs.count > 0 {
    let dirURL = pathURLs[0]
    print(dirURL)
}
This resulted in a string something like:
file:///Users/myName/Library/Developer/CoreSimulator/Devices/5BE7BEE7-1AA7-4BB8-BAE3-7E444441D4F3/data/Containers/Data/Application/0D401AA5-19CC-4F50-B956-34F89D671FA9/Documents/
Using Terminal, I traversed my way to that directory. Sure enough, the file was there.
Next, I needed to find out how to retrieve the filename(s) from that directory. Using Apple’s documentation, I found that this code would give me an array of NSURL objects representing the contents of a directory:
let fileArray = try? NSFileManager.defaultManager().contentsOfDirectoryAtURL(dirURL, includingPropertiesForKeys: nil, options: .SkipsHiddenFiles)
  • The try? is necessary because NSFileManager.defaultManager().contentsOfDirectoryAtURL can throw an error; with try?, a thrown error simply comes back as nil. This is indicated in Apple’s documentation by the keyword throws:
func contentsOfDirectoryAtURL(_ url: NSURL,
   includingPropertiesForKeys keys: [String]?,
                      options mask: NSDirectoryEnumerationOptions) throws -> [NSURL]
  • Note: in the end code, I wrapped this statement with a guard/else statement. Since there’s no point in continuing if no files are present, this way the function can simply return if there’s an error. If files are found, though, I then don’t need to force-unwrap the array (no ! needed when I use the array).
  • The first parameter of contentsOfDirectoryAtURL is the path to search. That was my dirURL variable.
  • Next, contentsOfDirectoryAtURL wants to know if I’m looking for specific properties, such as file permissions. I didn’t need to specify any properties, so I just put nil there.
  • There are multiple NSDirectoryEnumerationOptions values; however, the only one that works in the contentsOfDirectoryAtURL function is .SkipsHiddenFiles. My file was not hidden, so I went with that option. (If I needed to include hidden files, I’d just pass an empty option set - [] - there.)
Okay, so now I have a list of files in the directory, but… how do I find exactly the file(s) I want???
Well, I learned a bit about NSPredicate. An NSPredicate object lets me specify a string to search on in an array’s contents. I found a helpful cheatsheet for NSPredicate here:
A handy guide to using NSPredicates
I had named my file with a wav extension, so I wanted to search for that. So, here’s my predicate:
let predicate = NSPredicate(format: "SELF MATCHES[c] %@", "wav")
  • SELF represents the object being searched.
  • MATCHES compares against the right-hand side as a regular expression; here the pattern is plain “wav”, so it’s effectively an exact match
  • [c] indicates the search should be case-insensitive
  • %@ is Objective-C’s replaceable parameter notation for an object (such as a string)
  • ”wav” is the string that will replace the object parameter (%@) in the format string
To test the value of my format string, I used the predicateFormat property that NSPredicate provides:
print(predicate.predicateFormat)
The string comes out as:
SELF MATCHES[c] "wav"
At this point, I had the fileArray variable. Now, I needed to create a new array to hold the results from using the predicate:
let wavOnlyFileArray = fileArray.filter { predicate.evaluateWithObject($0.pathExtension) }
  • No ? is needed anywhere here because the original array won’t be nil. The guard statement I used when creating fileArray ensured that for me.
  • Arrays in Swift have a filter function. It lets you use a “closure” (an unnamed function) to process the array in some way. This is where I used the predicate. (Note that the closure needs to be in curly brackets after the filter function.)
  • The predicate’s evaluateWithObject function, because it’s in the closure, will be run on each item in the array. The $0 represents the current item.
  • NSURL objects have a pathExtension property. As a result, I can simply append that property, using dot notation, to the $0 - and I’ll get a string representing the current path’s extension as we loop (iterate) through the original array.
  • The final array, wavOnlyFileArray, will contain only the path URLs with a wav extension!
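If NSPredicate feels like overkill, the same case-insensitive extension check can be written directly in the filter closure. A small sketch on plain strings (current Swift syntax; the strings here stand in for the array of URLs):

```swift
import Foundation

let names = ["take1.wav", "notes.txt", "take2.WAV", "cover.png"]

// keep only items whose extension matches "wav", case-insensitively -
// same idea as the predicate, but done with a plain comparison
let wavOnly = names.filter { ($0 as NSString).pathExtension.lowercased() == "wav" }
print(wavOnly)
```

This keeps "take1.wav" and "take2.WAV" and drops the rest, just like the predicate version.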
Now we can delete those items! 
for aWavURL in wavOnlyFileArray {
    do {
        try NSFileManager.defaultManager().removeItemAtURL(aWavURL)
    }
    catch {
        print("Unable to remove file: \(aWavURL)")
    }
}
  • This for loop doesn’t need anything special in the way of optionals handling. If wavOnlyFileArray ends up being empty, that’s okay - the loop simply won’t iterate.
  • I’m not setting or working with any variables that might result in optional variable handling. Therefore, a guard or if let statement wouldn’t really be applicable here.
  • NSFileManager’s removeItemAtURL function, though, throws when an error occurs. So, I used a do - try - catch statement to handle the throws possibility.
  • The code in the do portion will run happily along as long as there’s no error.
  • I’m trying the removeItemAtURL call because it might throw an error.
  • If an error does occur, then the code will move to the catch. I can respond to the situation there. Otherwise, the code will just end normally.
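Put together, the whole cleanup can be exercised against a throwaway directory. This sketch uses current Swift syntax (FileManager and URL replaced NSFileManager and NSURL after Swift 2.1), so treat it as a translation of the code above, not the original:

```swift
import Foundation

// Delete every .wav file (case-insensitive) in the given directory.
func removeWavFiles(in dirURL: URL) {
    let fm = FileManager.default

    // bail out quietly if the directory can't be read
    guard let contents = try? fm.contentsOfDirectory(
        at: dirURL, includingPropertiesForKeys: nil, options: .skipsHiddenFiles
    ) else { return }

    for url in contents where url.pathExtension.lowercased() == "wav" {
        // ignore individual failures; a real app might log them instead
        try? fm.removeItem(at: url)
    }
}
```

To try it out, create a temporary directory, drop in a .wav file and some other file, call removeWavFiles(in:), and confirm only the non-wav file remains.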
Note: I used the URL options for handling file paths, rather than the string-based path versions. Apple indicates their preference in the NSFileManager Class Reference:
When specifying the location of files, you can use either NSURL or NSString objects. The use of the NSURL class is generally preferred for specifying file-system items because they can convert path information to a more efficient representation internally.
Where I am aware, I always try to use Apple’s preferred methods. Often, I find their preferred methods are easier, offer more clarity in code reading, and have additional benefits. For example, as you can see from the NSFileManager documentation:
You can also obtain a bookmark from an NSURL object, which is similar to an alias and offers a more sure way of locating the file or directory later.

Tuesday, February 2, 2016

Swift: Optionals?! (and IBOutlet)

I'm trying to understand Swift's concept of “optionals”. The idea is that Swift won't allow a variable to hold nil unless the developer has explicitly declared that nil is a possibility. (For simplicity, think of nil as another word for null.)
If you know that something might end up being nil (through no fault of your own, of course), you should declare the variable as an “optional” by using a question mark after the type declaration, like this:
var myInt: Int?
If you're certain that something will have a value by the time it's used, but you haven't given it one yet, you tell Swift using an exclamation mark:
var myInt: Int!
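A quick runnable sketch of the difference (this part of the syntax is unchanged in later Swift versions):

```swift
// Optional: nil is a legitimate value; unwrap before use.
var maybeInt: Int? = nil
if let value = maybeInt {
    print("got \(value)")
} else {
    print("no value yet")       // this branch runs, since maybeInt is nil
}

// Implicitly unwrapped optional: starts nil, but we promise
// it will be set before anyone reads it.
var promisedInt: Int! = nil
promisedInt = 7
let doubled = promisedInt * 2   // unwraps automatically; crashes here if still nil
print(doubled)
```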
But, then I noticed that Xcode inserts the following when connecting a label in a view to its controller:
@IBOutlet weak var recordingLabel: UILabel!
WHAT?!
weak means that the label might not be there. It could potentially be nil.
Yet, Xcode is giving the label a !, meaning that it’s certain to have a value.
How can that be???
Well...
@IBOutlet weak var recordingLabel: UILabel!
1. -- Xcode places weak here because the View owns the label. Therefore, if the View goes away, the label should be able to go away with the View.
 -- If strong were used here, the label might still be hanging out in memory after its View is gone.
 -- I can make an assumption that my code associated with the View wouldn't be running anyway if its View weren't there. So, I don't need to force the label to stay with a strong.
2. Since I can make the assumption that my View will be present while my View Controller is running, then I can use the ! to “implicitly unwrap” the label. If the View is there, then my label will be there, too.
3. Finally, the ! is necessary to tell the compiler that I, as a developer, know the label will be there. The compiler is going to trust my judgement in this case. If the label isn't there at runtime for some reason, though, then I'll get a runtime crash.