The smartphone has been around for less than ten years, yet by August 2015 two thirds of the UK population were using one on a daily basis [1]. It has revolutionised how people process and receive information, and global usage produces striking statistics: some 60,000 photos are shared every second, amounting to over 5 billion a day [2], gobbling up storage space.
From internet access, email and music streaming to social media, storing precious memories as photos and videos, playing games and even making telephone calls, user requirements of the smartphone are many and varied, blurring the boundary between work and play and creating huge demand for a better user experience.
As an interdisciplinary sound designer interested in using smartphone technology for game-driven dynamic audio experiences, I am aware that the smartphone has yet to realise its full potential in its uptake of converging technologies. In my case, I first started using Unity around two years ago. I love how easy it is to build for multiple platforms and the potential it provides for rapid prototyping. It’s the same feeling I got when I was first introduced to Pure Data, whose visual approach allows for quick prototyping of creative dynamic audio. Devouring Peter Brinkmann’s book Making Musical Apps then led me to use LibPd with Xcode.
While the processing power of mobile technologies is ever improving, device storage capacity still causes problems. Often when exhibiting my mobile game Hedra, people go to download it only to discover they have no space left on their device (despite the game’s meagre 30 MB size). As a developer, I am keen to keep the size of my apps to a minimum, and this leads to my interest in Pure Data.
When there are already excellent game audio tools such as Wwise or FMOD, why would I want to use Pure Data? Normally, achieving a sense of variation and dynamism with sampled audio means packaging large numbers of sound files with the game, thereby increasing its size. This is where Pure Data shines: it allows sound to be generated procedurally, without the demand for storage. In addition, there is no graphical interface to shape the user’s choice of interaction design, opening the possibility of much more distinctive sound design. For examples of procedural audio created with Pure Data, Andy Farnell’s work is an excellent starting point.
I came across LibPd4unity, which allowed fairly straightforward integration of Pure Data into Unity games for OS X and Android. It worked quite nicely but, alas, not on iOS. So I went searching for something that would integrate Pure Data with Unity on iOS. The missing link: Heavy, a cloud-based service which compiles Pure Data patches to portable C code, JavaScript and a range of binaries.
In this tutorial I aim to show how to set up a simple oscillator in Pure Data which can be controlled from an iOS app built in Unity, using Enzien Audio’s Heavy to generate high-performance code from the Pure Data patch.
Building a Pure Data patch and using the Heavy compiler
- Build a patch! Take special care to use only objects supported by Heavy. Their list is still growing, so refer to the list of supported objects for the most up-to-date coverage.
- To expose receive objects within Unity they must be annotated with `@hv_param`. Internal receives that don’t require controlling from Unity will function as normal without this extra annotation.
- Specify the receive’s range within the receive object, as follows: `[r frequency @hv_param 200 2000 400]`, where 200 is the minimum, 2000 is the maximum and 400 is the default initialisation value.
- Using Unity’s AudioMixer to host the generated plugin requires that an `adc~` object be included in the patch, even if there is no audio input from Unity. Here is an example of a patch set up to work with this method.
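As a minimal sketch, a patch meeting these requirements might look like the following when saved as a .pd file: an exposed `frequency` receive driving an `osc~`, an `adc~` included even though its input goes unused, and output via `dac~`. The object positions and the 0.2 output gain here are arbitrary choices, not requirements:

```
#N canvas 0 50 450 300 10;
#X obj 50 30 r frequency @hv_param 200 2000 400;
#X obj 50 70 osc~;
#X obj 50 110 *~ 0.2;
#X obj 250 30 adc~;
#X obj 50 150 dac~;
#X connect 0 0 1 0;
#X connect 1 0 2 0;
#X connect 2 0 4 0;
#X connect 2 0 4 1;
```

The `adc~` (object 3) is deliberately left unconnected; it only needs to be present in the patch for the AudioMixer hosting method to work.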
Compiling using Heavy
- Go to Enzien Audio’s website and register if you haven’t already done so.
- Go to “My Patches”
- Enter a name for your patch and click + New Patch
- Choose the Pure Data patch that you wish to compile, click Compile.
- You should now see a list of generated files ready to download; you can also test functionality using the web player. For this tutorial you will be using the Unity OS X and source files.
Building a Static Library in Xcode
In this step you will compile a static library to allow the audio plugin to work on iOS.
- Download the generated Unity OS X files and unzip
- Download the Unity Source files and unzip
- Open the plugin Xcode project located in the Xcode folder within the Heavy source files you just downloaded
- Go to Editor > Add Target
- Select iOS > Framework & Library > Cocoa Touch Static Library
- Name the product
- In the project view set Build Settings > Architectures > Build Active Architecture Only to NO
- Set Valid Architectures to arm64 armv7 armv7s armv6
- Set the Active Scheme to the static library just created and the destination to Generic iOS Device
- Press the play button to build the library
- Find the generated files in Library > Developer > Xcode > DerivedData > Hv_Simplesynth_plugin-(a random string of letters) > Build > Products > Debug-iphoneos
- Copy the .a file, and the .h file from within include > Hv_Simplesynth_plugin, into Assets > Plugins > iOS once you have set up your Unity project.
Setting up Unity
- Head over to Unity; you will need Unity 5, as this method depends on its AudioMixer feature
- Create a new project
- Create a new folder called Plugins in the Assets folder and within it create a new folder called iOS
- Copy the .h and .a files generated by Xcode earlier into the new iOS folder
- Head back to the original source files downloaded from Heavy, go to the folder named source, and drag its contents into the iOS folder in Unity (Assets > Plugins > iOS)
- In the Unity OS X files downloaded from Heavy copy AudioPlugin_Hv_(X).bundle to the Assets folder
- Open the AudioMixer Window by going to Window > AudioMixer
- Add a new Mixer (it can be named anything)
- Go to the Master channel and click Add, then select your audio plugin from the bottom of the list. It should now appear in the Inspector window, where you can alter the parameters you have exposed
- Create a new game object and add an Audio Source component
- On the Audio Source, set the Output to the mixer channel that’s just been set up; it will be listed under the name of your mixer as “Master”
Expose the plugin parameters of your Heavy plugin within the newly created Audio Mixer to allow for runtime manipulation. Unity have made a handy video explaining how this is done:
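Once a parameter is exposed on the mixer, it can also be driven from a script at runtime via `AudioMixer.SetFloat`. The sketch below assumes the exposed parameter was named `frequency`, matching the receive object shown earlier; the component and sweep behaviour are illustrative choices, not part of the Heavy workflow:

```csharp
using UnityEngine;
using UnityEngine.Audio;

public class FrequencyController : MonoBehaviour
{
    // Assign the mixer hosting the Heavy plugin in the Inspector.
    public AudioMixer mixer;

    void Update()
    {
        // Sweep the exposed "frequency" parameter between the patch's
        // 200–2000 Hz range. The string must match the name given when
        // exposing the parameter on the AudioMixer.
        float freq = Mathf.Lerp(200f, 2000f, Mathf.PingPong(Time.time * 0.25f, 1f));
        mixer.SetFloat("frequency", freq);
    }
}
```

Attach the script to any game object and drag your AudioMixer asset onto its `mixer` field in the Inspector.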
Building the app, some tweaks and running in Xcode
- Build your project for iOS, generating an Xcode project
- Before running the project, a few changes must be made in order to link the audio plugin with the generated project
- Open the generated Xcode project
- Navigate to Unity-iPhone > Classes and add `#import "AudioPluginInterface.h"` to UnityAppController.h
- In UnityAppController.mm, add the line `UnityRegisterAudioPlugin(&UnityGetAudioEffectDefinitions);` to the `preStartUnity` function
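Put together, the two edits look roughly like this, assuming the Unity 5-era generated project where `preStartUnity` is an otherwise empty method (the surrounding generated code is elided; only the added lines matter):

```objc
// UnityAppController.h — add alongside the existing imports:
#import "AudioPluginInterface.h"

// UnityAppController.mm — inside the existing preStartUnity method:
- (void)preStartUnity
{
    // Register the Heavy-generated audio plugin with Unity's audio engine
    // before the engine starts up.
    UnityRegisterAudioPlugin(&UnityGetAudioEffectDefinitions);
}
```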
- Run the app on your device
Huzzah!
I would like to thank Martin and Joe from Enzien Audio for their help and advice on a couple of details that filled in some missing links. I would also like to acknowledge the work detailed in this thread on the Unity forum [3].
Sources
[1] http://www.deloitte.co.uk/mobileuk/assets/pdf/Deloitte-Mobile-Consumer-2015.pdf
[3] http://forum.unity3d.com/threads/native-audio-plugin-bundle-import-settings-for-ios-build.315590/