Ambisonic audio workflows for 360 films and VR

This is a living document, constantly updated - email me if you think I’ve missed anything out or got something wrong! Last updated 31st Jan 2024. I have no affiliation with any of these products or manufacturers.


There are a number of systems available for mixing sound for a 360 film, or other VR-style environment. Many of these were developed in the late 2010s alongside the second wave of VR. This page is intended as a review of the available systems, and a look at some of their bugs and issues. Be aware that I may not always update this page quickly, so software updates may have resolved issues listed here. I’ve included the software release versions I’m referencing below so you can check the release history to see whether issues have been solved.

If you’re new to these workflows, they can be a bit complex to get your head around. Typically they consist of a few components:

  • A panner plug-in. This will take either a mono, stereo or ambisonic sound, and allow you to pan it around in space. It’s typically inserted into each audio channel you want to spatialise.

  • A video player. This lets you load in a 360 video and locate/pan your sounds to objects in the video. It may also allow you to create a viewport onto the 360 video (where you only see a section of the 360-degree film), which you can move around with your mouse - or, in a VR headset, by moving your head. In both situations you’ll hear the sound played from your DAW change in response to these movements, so you can audition what it will sound like to the end user. The Oculus Quest is currently the most popular headset, so I’ll focus on how that can be used.

  • A master plug-in. This often co-ordinates common actions between the video player, viewport tracking and everything else. It will usually provide a method to audition the output in binaural sound over headphones, before you bounce it out as a multi-track ambisonic file.

  • An encoder application. This will take the audio file that you bounce out of your DAW, combine it with the movie file, inject metadata about the audio tracks and provide a file that is ready to upload. Different platforms (YouTube, Facebook, Plex, etc.) have different requirements for the uploaded files.
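To make the panner component a little less abstract: mathematically, first-order panning is just weighting the input signal across four channels according to the source direction. Here’s a minimal sketch in Python, assuming AmbiX (ACN channel order, SN3D normalisation) and azimuth measured anticlockwise from straight ahead - the real plug-ins add smoothing, spread and higher orders on top of this:

```python
import math

def ambix_pan_mono(sample, azimuth_deg, elevation_deg):
    """Encode one mono sample into a first-order AmbiX frame (W, Y, Z, X).

    Assumes SN3D normalisation, azimuth anticlockwise from the front,
    elevation positive upwards.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample                                  # omnidirectional component
    y = sample * math.sin(az) * math.cos(el)    # left/right
    z = sample * math.sin(el)                   # up/down
    x = sample * math.cos(az) * math.cos(el)    # front/back
    return (w, y, z, x)
```

A source dead ahead puts equal signal in W and X and nothing in Y or Z; pan it 90 degrees to the left and the energy moves from X into Y.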

There are some other “optional” extras you will need to mix for ambisonics: compressors, limiters, loudness monitors, reverbs, etc. Some of the systems below provide these and some do not, but there are plenty of other means to get these extra elements. Check out the IEM plug-in suite (free) and the O3A Core plug-ins (also free), plus an increasing array of commercial plug-ins. There’s a great list of freeware plug-ins here. You can also often mix and match components from different manufacturers, and there are lots of individual components (such as panners and various other ambisonic spatialisation tools) out there. What is key to these systems (and why I’m not looking at individual components and plug-ins) is their integration with a video playback system.

Virtually all the systems below are compatible with Reaper, Nuendo, Pro Tools Ultimate (not Standard), etc. Note that Logic is not currently compatible with ambisonics beyond first order due to its limited output busses, so none of these systems work effectively on it. As Apple are increasingly embracing Spatial Audio for their AirPods and Vision Pro, one might imagine that Logic Pro would be at the heart of the spatial audio workflow, and even that Safari might support Spatial Audio… Whilst you can do some basic Dolby Atmos work in Logic, it’s implemented via a massive software workaround, rather than re-writing the software to do spatial audio properly.
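The bus problem is easy to quantify: a full-sphere ambisonic stream of order N needs (N+1)² channels, so first order fits in 4 channels but the third-order work most of these systems support needs 16 - well beyond what Logic’s busses offer. In Python:

```python
def ambisonic_channels(order):
    """Channels needed for a full-sphere ambisonic stream of a given order."""
    return (order + 1) ** 2

# First order: 4 channels. Second: 9. Third: 16.
```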

If you’re getting started with all this, Reaper is the best place to start as there are a lot of tutorials online for using Reaper, and you can use the Evaluation license to get to know the software before committing to it financially. I used Reaper initially but am in the process of moving over to Nuendo. I can’t recommend ProTools Ultimate due to its cost.


Facebook Spatial Workstation

(aka Audio 360, Facebook 360, FB360, Two Big Ears and TBE!)

Link: https://facebook360.fb.com/spatial-workstation/

User updated build: http://www.angelofarina.it/Public/FB360/Mac-new-2023

Cost: Free

In active development: No. Last updated in May 2020, and in May 2022 Facebook removed links to download the software. There are also problems installing the software on Apple silicon computers (M1, etc). Apparently the separate Encoder application will remain in development. In 2023 a user-made release updated some of the components used by FB360 to work with Apple silicon, extending the life of the software considerably.

Issues: See Video Player FFmpeg issue below.

Overview: This was a very popular system, both because it was free and because it’s a very mature, well-thought-out piece of software. It has a very complete set of components: panner, loudness monitor, video player and encoder. However, as Facebook are no longer providing software updates or downloads for it, it is now impossible to recommend.

Panner: The panner system is great. You can load your 360 video into the panner and place sounds onto the video very easily (compared to other systems where the video sits in the video player window, and you have to move a puck around the panner window). It can track objects, though I’ve not tested this. You can disable modules like directivity and room acoustics to preserve CPU power. It can deal with mono, stereo or ambisonic inputs.



Video Player & Auditioning: FB360 uses its own bespoke video player, rather than trying to use your DAW’s built-in one (as Audio Ease do). The video player also allows you to connect a VR headset to audition your sound in there. It requires you to convert any movie files into DNxHD (LB) format, which is time-consuming and makes for very big movie files. The manual is a bit out of date here, which doesn’t help matters. The Mac software for converting files is FF-Works. Do not try to use h264 files, as the timings will drift out and you’ll find yourself trying to sync audio to the wrong frames. The video player can also be hosted on a second computer, linked over a network, so you can reserve your main computer’s CPU power for the audio rather than for playing back a large video file - see below for instructions.

The video player relies on an external piece of software called FFmpeg to work. The Spatial Workstation installer currently (v3.3.3 and below) fails to install this, as the FFmpeg version it references doesn’t exist anymore. This bug has existed for over a year without being fixed. Fortunately, a helpful user, Tormy Van Cool, has produced a downloadable version of FFmpeg which works well, available here.

Using FB360 Spatial Workstation on an Oculus Quest: This can be done on a PC by using Oculus Link. It’s a bit fiddly, and the manual doesn’t cover it.
1) Connect your Quest to your PC (or Mac running Bootcamp) (use a cable like this one). Power Quest up, and DENY access to your computer. Go to Settings App on the Quest headset and select Oculus Link. The Oculus app will open on your PC - you can just minimise that.
2) Open one of the FB360 panner plug-ins in your DAW, and click the “Breakout Video” button in the top right (this opens up the video selected in the panner using FB360 Video Player)
3) In FB360 Video Player, click the VR icon on the right hand side
4) FB360 VideoClient will open with a viewport. Steam VR may also open; you can quit that altogether. As the VideoClient opens both Oculus and SteamVR, sometimes SteamVR will take over the Quest and you won’t be able to see your video - quit SteamVR to leave Oculus Link in control.
5) Go back to FB360 Video Player, and where it says Mouse on the top left, click on that drop down box and select your computer/DAW name. This allows VideoClient to control Video Player, which in turn controls the FB360 Control plug-in.
6) Ensure that Get From Video is turned on, in the FB360 Control plug-in (on one of your group or output busses)
At this point you should be able to move your headset around, and see FB360 responding to that.
7) Route your DAW audio to the Oculus Quest so that you are hearing it in sync, and mindful of the Quest’s limited headphone amp volume ;-)

Sadly the Quest auditioning doesn’t allow you to do anything other than watch, spin and listen, but that’s very useful nonetheless.
We can all dream that a later version might feature DAW transport control and the ability to move some of the panner pucks around, as DearVR’s Spatial Connect allows, but considering Spatial Workstation is free, it’s a very impressive package.

Mac / PC: The Mac and PC versions are almost identical, but the Mac version doesn’t allow you to use a VR headset directly. You can however use the FB360 Video Player on a PC, linked to a headset, remote-controlled by FB360 running in your DAW on a Mac. This is a very handy way to split the CPU load between playing a large video file and everything your DAW is doing. To set this up, assuming you already have FB360 running in your DAW on your Mac:
1) Connect your Quest to your PC (or Mac running Bootcamp) using a cable like this one. Power Quest up, and allow access to computer. Go to Settings App on the Quest headset and select Oculus Link.
2) Open the FB360 Video Player application on your PC. This is a standalone application, not a plug-in in a DAW. Select “Remote”, load your video, set “Select Display Format” to VR, and most importantly select your Mac laptop in the “Connect to DAW Clock” section. Then click Open.
3) FB360 VideoClient will open on your PC with a viewport. The Oculus app will open on your PC - you can just minimise that. Steam VR may also open; you can quit that altogether. As the VideoClient opens both Oculus and SteamVR, sometimes SteamVR will take over the Quest and you won’t be able to see your video - quit SteamVR to leave Oculus Link in control.
4) Open one of the FB360 panner plug-ins in your Mac DAW, and click the “Breakout Video” button in the top right (this opens up the video selected in the panner using FB360 Video Player)
5) On FB360 Video Player on your Mac, where it says Mouse on the top left, click on that drop down box and select your PC computer/DAW name. This allows VideoClient on the PC to control Video Player on the Mac, which in turn controls the FB360 Control plug-in.
6) Ensure that Get From Video is turned on, in the FB360 Control plug-in (on one of your group or output busses)
At this point you should be able to move your headset around, and see FB360 responding to that.
7) You should now be hearing audio from your DAW, and seeing the video in your headset. You won’t be able to route the audio straight to the headset in this setup.


Encoder.
The FB360 Encoder application is a great way to prepare video files for distribution. It will combine the spatial audio files, the head-locked stereo files and the movie file together to produce an upload-ready file. It can prepare videos for Facebook, and these files can also be played back directly in the Oculus Gallery application. The latter is the easiest way to distribute files for testing, as you can send a Dropbox link of the encoded file, and the person at the other end can use Android File Transfer to load it onto their Quest and play it back at full resolution using Oculus Gallery. Currently (v3.3.3) you cannot use the Encoder app to prepare files for YouTube (which requires a first-order AmbiX file and a head-locked stereo file) - the encoder allows you to select everything, but when you click Encode it throws up an error claiming that YouTube does not support Spatial Audio plus a head-locked stereo track, which it has supported since February 2018! Instead you will need to follow the instructions here to combine the files and inject the metadata using the Spatial Media Metadata Injector.

Manuals and Template sessions: The manual is pretty comprehensive, though it was clearly written a fair few years ago and hasn’t been systematically updated alongside the software - a common problem. The DAW templates, which are automatically downloaded with the software, are really good though. Manual here.


DearVR

Link: https://www.dear-reality.com

Cost: £££ but often on sale, especially via www.plugin-alliance.com. Note, whilst DearVR Pro is often on sale, Spatial Connect is rarely on sale.

In active development: Yes. DearVR is a subsidiary of Sennheiser.

Issues: Nothing of significance

Overview: DearVR offer a range of plug-ins and applications that can be purchased separately, with some components available for free. DearVR Pro is their panning software, which also includes a range of reverbs and acoustic simulations. dearVR Ambi Micro is their free plug-in, which can be used to convert an ambisonic mix to binaural for auditioning. Or you can use DearVR Monitor, which allows the use of a head-tracker. dearVR Spatial Connect allows in-headset auditioning (and lets you adjust parameters of DearVR Pro’s panners so you can mix in VR, rather than having to peer out at your DAW through the gap between the bottom of the headset and your nose). Through their link to Sennheiser and the Ambeo ambisonic mic, many of the plug-ins are optimised to easily bring in recordings from an Ambeo mic, which is handy. But it is expensive, with a bundle of DearVR Pro and Spatial Connect currently costing $648.

Panner: The DearVR Pro panner offers a lot of options. Obviously you can pan the sound around. You can also adjust occlusion for the sound source (simulating putting a wall between you and it), which is good for treating close-mic’d recordings. (TDR Labs’ Proximity plug-in is also very useful for this - sadly discontinued, but still working for now.) You can also add reflections, for example to simulate the sound source being adjacent to a reflective wall. And there is a range of reverbs that can be added, from car interiors to churches.

There is no way to use the panner on its own to match the panning of a sound to the image - you can’t load up a 360 movie in your DAW, mouse around it and hear the sound change. You need the Spatial Connect software to do this, which is both expensive and PC-only. If you can’t afford it, you can use Ambi Micro as an ambisonic-to-binaural decoder, manually adjusting the yaw and pitch controls and working it out by ear, but that is very limiting.

Video Player & Auditioning: (Caveat: whilst I own most of DearVR’s plug-ins, I don’t own Spatial Connect.) Spatial Connect works as the video player component. Similar to FB360’s Video Player, Spatial Connect can run on the same machine as your DAW, or on a separate computer networked to your DAW computer, which allows the CPU load to be split across two machines. Two things of note here. First, whilst all of DearVR’s software is Mac and PC compatible, Spatial Connect is PC-only - this is more to do with VR headset manufacturers only making their link software for PC than a fault of DearVR’s. If you have a two-computer set-up, Spatial Connect can run on a PC and communicate with your DAW on a Mac or PC. Second, it will only communicate with the Reaper or Nuendo DAWs (i.e. it is not compatible with Pro Tools). But here’s the good bit: if you’ve got all of this up and running, not only do you have a 360 video player with a movable viewport on your computer monitor, you also have an in-headset editing and mixing experience that no other system offers. You have transport and metering within the headset, and can use hand controllers to grab imaging pucks and move them around to match your 360 video. I’ve played around with a demo of this and it’s pretty awesome. With the other systems here, automating the movement of an object requires automating multiple parameters on the panner element. With Spatial Connect you can put it into automation-write mode and drag the puck around the video with your hand, which really saves a lot of time.

A handy additional feature is that once you’ve built up your mix and spatialisation in your DAW using the plug-ins, Spatial Connect can then export all your sources, settings & automation straight into Unity. It generates prefab objects that contain the sound, with animation components carrying all your DAW automation. Your Unity project will need the DearVR Unity audio engine to utilise this. I’ve had a quick play with the Unity plug-in and whilst it is pretty good, the version I tried didn’t really handle occlusion of objects at all.

Spatial Connect works in conjunction with the dearVR Ambi Micro plug-in, which receives the head-tracking data from the headset and creates a binaural headphone output to match the headset orientation.
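Conceptually, all of these head-tracked monitors do the same first step: rotate the ambisonic scene to counteract the head movement, then decode to binaural. For first order, a pure yaw rotation only mixes the X and Y channels; W and Z pass through untouched. A sketch (the rotation-direction convention here is my own assumption - the real plug-ins also handle pitch and roll):

```python
import math

def rotate_foa_yaw(w, y, z, x, yaw_deg):
    """Rotate a first-order AmbiX frame (W, Y, Z, X) about the vertical axis.

    Convention assumed: positive yaw turns the scene anticlockwise, so a
    source dead ahead ends up on the listener's left after +90 degrees.
    """
    a = math.radians(yaw_deg)
    y2 = y * math.cos(a) + x * math.sin(a)
    x2 = x * math.cos(a) - y * math.sin(a)
    return (w, y2, z, x2)   # W (omni) and Z (height) are unaffected by yaw
```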

DearVR recommend that you convert movie files for optimum use. The manual links to a version of FFmpeg which doesn’t exist any more.


Encoding: There is no encoder software, and there isn’t really any guidance provided on how to export your mix for upload to YouTube, etc. However, you can use the FB360 Encoder or Spatial Media Metadata Injector tools - see AudioEase’s 360pan suite page, the “Export to YouTube” and subsequent sections, for info on this.

Mac vs PC: See above re Spatial Connect

Quest compatible: Yes (I’ve not tested it), with Spatial Connect and Oculus Link

Manuals and Template sessions: Manuals are here.


360pan suite

Link: https://www.audioease.com/360/

Cost: ££

In active development: Yes, but Audio Ease provide infrequent updates across all their products.

Issues:

Overview: This system consists of a few components: the 360pan panner, the 360monitor video player, 360reverb, 360limiter, 360radar and 360turner. 360radar overlays a visual loudness meter on your 360 video, which is handy. Crucially, though, the suite does not allow you to audition your mix using a VR headset. You can purchase a relatively cheap head-tracker to go on your own headphones to control the rotation of the video player, but it’s just not the same as being in the headset, where the scale of everything changes so much. What sets it apart from other systems is the 360reverb, which provides an ambisonic version of Audio Ease’s renowned Altiverb. The 360limiter is also a useful tool.

Panner: The panner is relatively cut back compared to FB360 Spatial Workstation, with a simple interface. Irritatingly, you can’t load your video into the panner window; whilst the panner has a generic 3D display on it, it’s easier to move the imaging puck around on your DAW’s video player window. The panner plug-in also contains an aux send to the 360reverb plug-in, so you can use that as a send-return effect rather than having to insert it on every channel you want it on.

You can use the position blur parameter to anchor audio that shouldn’t move with the listener’s head (i.e. head-locked), even if you are working in a format that doesn’t support head-locked stereo tracks.
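One way to understand why this works: a source blurred over the whole sphere leaves signal only in the omnidirectional W channel, and W is unchanged by any head rotation, so the sound stays put as you look around. A rough sketch of the idea (not Audio Ease’s actual DSP):

```python
def head_locked_foa(sample):
    """Place a mono sample so it ignores head rotation: all energy in W.

    A yaw/pitch/roll rotation of a (W, Y, Z, X) frame never alters W,
    so a W-only signal sounds identical from every viewing direction.
    """
    return (sample, 0.0, 0.0, 0.0)
```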

The panner can deal with mono and stereo input sources. For a stereo input source you get two pucks that you can position in space.

You move the (yellow) puck around on the panner, on the left side, but have to watch its position in the video player window, centre, to ensure it lines up with the desired source in the video. Or you can move it around on the video player window itself, and edit a couple of the panner’s parameters there. It’s a bit clunky and takes up a lot of unnecessary screen space. 360monitor, on the right, allows you to adjust the perspective you’re listening from.

Video Player & Auditioning: 360pan suite uses your DAW’s built-in video player to display the whole panorama of the video. It then hacks a grid overlay over the top of that, and overlays each 360pan puck over it too. It’s not pretty, but it works. Alongside this you would use the 360monitor plug-in to pan around the video. 360monitor provides ambisonic-to-binaural conversion based on where you’re looking. You can use the mouse to look around, or buy a 48-Euro Bluetooth head-tracking unit which attaches to your headphones and will rotate 360monitor as you rotate your head. Whilst this is fine if you’re working on something that is really only a 180-degree video, an inherent flaw becomes obvious when you’re working with 360 videos: if you rotate your head 180 degrees, you can hear the sound rotate, but you’ll no longer be able to see your monitor to watch the film! That said, you can mix and match components of a system: for example, use 360pan and 360reverb, which output in AmbiX format, but send that to the FB360 Spatial Workstation master plug-ins (set to AmbiX format) and use those to monitor in-headset. And compared to FB360, which requires you to convert your video file to DNxHD (LB) format, here you can just use whatever format is compatible with your DAW, saving some time.
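Mixing and matching works here because AmbiX is a well-defined interchange format: ACN channel order with SN3D normalisation. If you ever need to bring older FuMa-format B-format material into an AmbiX chain, the first-order conversion is just a channel reorder plus a 3 dB boost on W - a sketch (first order only; higher orders need per-channel scaling):

```python
import math

def fuma_to_ambix_foa(w, x, y, z):
    """Convert one first-order FuMa frame (W, X, Y, Z) to AmbiX (W, Y, Z, X).

    FuMa stores W 3 dB down (a factor of 1/sqrt(2)) relative to SN3D,
    and orders channels W, X, Y, Z rather than ACN's W, Y, Z, X.
    """
    return (w * math.sqrt(2), y, z, x)
```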

360reverb: The 360reverb unit is one of the main reasons why I bought and use the 360pan suite. It has over 60 IRs from a range of indoor and outdoor spaces, and these can really help place voices and sounds into a space.

360radar: This also deserves an honourable mention. It is very useful if you’ve used a 360 camera and want to ensure that you have matched up the alignment of the camera and the ambisonic microphone. I have also used it in an acoustics setting to try and identify where some unusual echoes were coming from in the space.

Encoding: There is no encoder software. They provide guidance on how to export your mix and use the FB360 Encoder or Spatial Media Metadata Injector tools to create a file.

Mac vs PC: Works equally well on both.

Quest compatible: No, not unless you use FB360 as well.

Manuals and Template sessions: There is a useful video outlining a workflow in PT here. There are a bunch of manuals, templates, example projects and lots of useful info and utilities here.


Noisemakers AmbiBundle HD (AmbiPan HD, AmbiHead HD and AmbiEyes)

Link: www.noisemakers.fr/

Cost: £

In active development: Yes

Issues: Unknown.

Overview: I will start off by saying I’ve not used this full system (though I’ve used AmbiHead in the past, which is really good), so most of this is derived from their website rather than actual usage. The AmbiBundle system consists of the AmbiPan HD panner, the AmbiHead HD ambisonic-to-binaural decoder, and the AmbiEyes 360 video player. The latter allows auditioning from a VR headset like the Quest, using Oculus Link. This system looks pretty neat, but isn’t documented very well either on the website or in the manual, so it’s easy to overlook. The AmbiVerb and AmbiLimiter both look pretty good too. It’s relatively cheap as well.

Panner: The panner seems to be relatively fuss-free in its layout. It can accept mono, stereo, quad, 5.1, 7.1 and octagonal input signals. The manual says that it only outputs first-order ambisonics, but it actually works up to third order. It works in conjunction with a little app called AmbiScene, which creates a floating grid that you can manually overlay on your DAW’s built-in video player to place the imaging pucks in the correct place. It’s slightly cumbersome - if you reposition or resize your video player, you also have to do the same to the AmbiScene window - but it works. You can also have the plug-in simulate air absorption loss over distance, which is a really nice touch, though I’d prefer a bit more customisation over it.
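To illustrate what distance-based air absorption is doing (NoiseMakers don’t publish their model, so the constant below is invented purely for illustration): the further away a source is, the more its high frequencies are rolled off, which is a big part of why distant sounds read as distant:

```python
def air_absorption_gain(distance_m, freq_hz, coeff_db=0.02):
    """Illustrative air-absorption curve: loss in dB grows with both
    distance and frequency.

    coeff_db (dB of loss per metre per kHz) is an invented constant
    for illustration, not AmbiPan HD's actual parameter.
    """
    loss_db = coeff_db * distance_m * (freq_hz / 1000.0)
    return 10 ** (-loss_db / 20.0)
```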


Video Player & Auditioning: AmbiHead HD is used to create a binaural version to listen to. AmbiEyes is a video player for monoscopic-only 360 videos (in H264, H265 and VP8 formats only), which allows you to mouse around a viewport onto your 360 video. The website states it’s compatible with VR headsets, including the Quest via Link. I’m unsure whether it just provides headset orientation data or whether you can also see the 360 video in the headset - there is no manual for this part of the software. It looks like it would work over a network to a second computer for remote playback of the video. It’s available for both Mac and PC, but the Quest use is PC-only.

Encoding: There is no encoder software, and the guidance provided on how to export your mix for upload to YouTube, etc. is out of date. However, you can use the FB360 Encoder or Spatial Media Metadata Injector tools - see AudioEase’s 360pan suite page, the “Export to YouTube” and subsequent sections, for info on this.

Mac vs PC: See above.

Quest compatible: Yes, with Oculus Link using AmbiEyes - PC-only of course, as Oculus Link is PC-only.

Manuals and Template sessions: The manual is here. It is out of date, referencing both old versions of the plug-in, and obsolete methods to export to YouTube.


Nuendo (using its built-in ambisonic tools)

Link: www.steinberg.net/nuendo/

Cost: ££ (check out the Competitive Crossgrades and annual sales, which make it a lot more affordable)

In active development: Yes, subsidiary of Yamaha

Issues: the GoPro 360 Video Player that Nuendo uses has been discontinued.

Overview: Nuendo has built-in support for ambisonic workflows. It comes with the VST MultiPanner, and a built-in head-tracking module that allows a headset to be connected. A 3Dconnexion CAD mouse can also be used. The VST AmbiDecoder plug-in is used to produce a headphone version to listen on. But its dependence on GoPro’s discontinued 360 video player makes this an uneasy solution. From some basic googling, GoPro’s software was discontinued in 2018, so it seems a little disingenuous that Nuendo is still being marketed as fit for this purpose.

Panner: The VST MultiPanner has a fairly simple interface, and offers basic panning functionality without any added extras.



Video Player & Auditioning: The VST AmbiDecoder plug-in is used to produce a headphone version to listen on, and can receive head-tracking data from a VR headset, 3D mouse and others. Nuendo can display a 360 video flattened out using its built-in video player, but to get a viewport that you can mouse around, Steinberg currently specify that you need the GoPro VR Player. Unfortunately GoPro have discontinued this software. I’ve managed to download it from here, but it was last updated in 2018, so it’s unlikely to keep working as operating systems evolve. Nuendo also has built-in support for dearVR’s Spatial Connect app for viewing 360 videos, which again is PC-only.

Encoding: There is no encoder software, and there isn’t really any guidance provided on how to export your mix for upload to YouTube, etc. However, you can use the FB360 Encoder or Spatial Media Metadata Injector tools - see AudioEase’s 360pan suite page, the “Export to YouTube” and subsequent sections, for info on this.

Mac vs PC: Whilst you can use Nuendo on a Mac or PC, because the head-tracking link is built into Nuendo and Oculus Link is only available for PC, you cannot run Nuendo connected to a headset on a Mac. If you are Mac-based, you would be better served running FB360 in Nuendo on your Mac, with a PC running the FB360 Video Player connected to your headset. It looks like you could previously use the discontinued GoPro VR Player on a second computer to control Nuendo on a Mac, but as it’s discontinued that’s a moot point.

Quest compatible: Yes, the built-in head tracking module can connect with Oculus Link. But there is currently no way to view a 360 video.

Manuals and Template sessions: The manual is here. There are no demo templates, but there are some YouTube videos that walk you through, such as here.


Handy hints

Mixing VR in-headset can be quite fiddly when you are jumping in and out of a VR headset and editing on your DAW. An alternative is to use Oculus Link and share your desktop - this makes your DAW visible to you inside the headset. It’s not perfect, and it’s not as good as dearVR Spatial Connect, but it can speed up the workflow. Check out this video, which demonstrates it in Nuendo.

Use a bit of electrical tape to cover the proximity sensor inside the Quest’s headset (in the middle, above the two lenses), which means you can take the headset on & off without it going to sleep.

FF-Works is a handy tool for converting movie files, and adding audio in and out. It’s quite hardcore though. An alternative is ER Media ToolKit Pro, which as well as doing all the converting can also burn timecode and logos onto a movie file. It’s available for Mac and PC, but does require an iLok.


What do I use?

I’m currently using the FB360 panner and Control system, supplemented with Audio Ease’s 360limiter and 360reverb plug-ins. DearVR hasn’t made it into my workflow yet, but I’m considering getting Spatial Connect and can imagine using it more fully then. I’ve predominantly used Logic Pro for editing to date, but it doesn’t properly support ambisonics (you’re limited to first-order ambisonics, by fooling Logic into thinking it’s a quadraphonic surround system, which isn’t great). I’ve been using Reaper as a DAW to mix ambisonics, but I’m not a huge fan of its visual aesthetic, and I’m in the process of replacing both Logic and Reaper with Nuendo.