An example of a first-order ambisonic encoding of a sound source.
The user can control the direction of the source.
The vector intensity visualizer analyzes the sound field and indicates the direction of the strongest sound activity.
In this case, since there is only a single source, the indicator simply follows it.
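A minimal sketch of how such a chain can be wired with JSAmbisonics and the Web Audio API is given below; the class, property, and port names (monoEncoder, binDecoder, intensityAnalyser, azim/elev, in/out) follow the library's documented usage but are assumptions here, not copied from the example's source.

```js
// Minimal sketch (assumed JSAmbisonics API): encode a mono source to FOA,
// decode it binaurally, and tap the B-format bus with the intensity analyser.
import * as ambisonics from 'ambisonics';

const ctx = new AudioContext();
const order = 1;                                         // first-order Ambisonics (FOA)

const encoder = new ambisonics.monoEncoder(ctx, order);  // mono -> B-format
const decoder = new ambisonics.binDecoder(ctx, order);   // B-format -> binaural
const analyser = new ambisonics.intensityAnalyser(ctx);  // drives the vector intensity visualizer

const source = ctx.createBufferSource();                 // assume `source.buffer` is set elsewhere
source.connect(encoder.in);
encoder.out.connect(decoder.in);
encoder.out.connect(analyser.in);                        // analyser taps the encoded sound field
decoder.out.connect(ctx.destination);

// Steer the encoded source; the visualizer should then follow it.
encoder.azim = 45;   // azimuth in degrees
encoder.elev = 0;    // elevation in degrees
encoder.updateGains();
```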
An example of loading FOA ambisonic recordings, with binaural decoding.
The user can control the rotation of the sound scene.
The vector intensity visualizer analyzes the sound field and indicates the direction of the strongest sound activity.
In this case, the indicated direction changes rapidly, following the activity in the sound scene.
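The player can be sketched with a scene rotator in front of the binaural decoder; the recording URL is a placeholder and the JSAmbisonics names (sceneRotator, binDecoder, yaw/updateRotMtx) are again assumptions based on the library's documented usage.

```js
// Sketch: play a 4-channel FOA recording, rotate the scene, and decode binaurally.
import * as ambisonics from 'ambisonics';

const ctx = new AudioContext();
const order = 1;

const rotator = new ambisonics.sceneRotator(ctx, order); // yaw/pitch/roll of the whole scene
const decoder = new ambisonics.binDecoder(ctx, order);
rotator.out.connect(decoder.in);
decoder.out.connect(ctx.destination);

// Load a 4-channel B-format recording (placeholder URL) with plain Web Audio calls.
const response = await fetch('recordings/foa_scene.wav');
const buffer = await ctx.decodeAudioData(await response.arrayBuffer());

const player = ctx.createBufferSource();
player.buffer = buffer;
player.connect(rotator.in);
player.start();

// Rotate the scene, e.g. from a UI slider.
rotator.yaw = 90;        // degrees
rotator.updateRotMtx();
```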
An example of applying a FOA ambisonic room impulse response to an anechoic sound source, with binaural decoding.
The user can control the rotation of the sound scene.
The vector intensity visualizer analyzes the sound field and indicates the direction of the strongest sound activity.
In this case, the indicated direction changes rapidly, following the activity in the sound scene.
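One way to realize the reverberation stage is to convolve the mono anechoic signal with each channel of the FOA room impulse response separately; the sketch below does this with standard Web Audio nodes only (one ConvolverNode per ambisonic channel) and illustrates the idea rather than the example's actual implementation.

```js
// Sketch: turn a mono anechoic source into an FOA stream by convolving it with
// each channel of a 4-channel FOA room impulse response (standard Web Audio only).
function makeFoaConvolver(ctx, rirBuffer) {   // rirBuffer: 4-channel AudioBuffer holding the RIR
  const nCh = 4;                              // FOA channel count
  const input = ctx.createGain();             // mono anechoic input
  const merger = ctx.createChannelMerger(nCh);
  for (let ch = 0; ch < nCh; ch++) {
    // Copy channel `ch` of the RIR into a mono buffer for one dedicated convolver.
    const mono = ctx.createBuffer(1, rirBuffer.length, rirBuffer.sampleRate);
    mono.copyToChannel(rirBuffer.getChannelData(ch), 0);
    const conv = ctx.createConvolver();
    conv.normalize = false;                   // keep the relative channel levels of the RIR
    conv.buffer = mono;
    input.connect(conv);
    conv.connect(merger, 0, ch);              // route this channel into the B-format bus
  }
  return { in: input, out: merger };          // same in/out convention as the library objects
}
```

The merger output can then feed the rotator and binaural decoder exactly as in the previous sketches.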
An example of listening to the output of a virtual microphone placed inside a FOA ambisonic recording.
The user can control the look direction of the microphone and its directivity pattern.
The vector intensity visualizer analyzes the sound field and indicates the direction of the strongest sound activity.
In this case, the indicated direction changes rapidly, following the activity in the sound scene.
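A sketch of the virtual-microphone stage follows; the virtualMic names (azim/elev, vmicPattern, updateOrientation/updatePattern) and the pattern string are assumptions based on the library's documented usage.

```js
// Sketch: extract a steerable virtual-microphone signal from an FOA stream.
import * as ambisonics from 'ambisonics';

const ctx = new AudioContext();
const order = 1;

const foaBus = ctx.createGain();              // stands in for any FOA B-format stream
const vmic = new ambisonics.virtualMic(ctx, order);
foaBus.connect(vmic.in);
vmic.out.connect(ctx.destination);

// Steer the microphone and choose its directivity pattern.
vmic.azim = -30;                              // degrees
vmic.elev = 10;
vmic.updateOrientation();
vmic.vmicPattern = 'cardioid';                // pattern name assumed from the library's options
vmic.updatePattern();
```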
An example of a higher-order ambisonic encoding of a sound source. The example demonstrates the effect of the ambisonic order on the perception of the sound source. The user can switch between different sets of decoding filters on-the-fly.
An example of higher-order ambisonic encoding using only circular harmonics (2D). Same example as above, but with the 2D classes and horizontal-only spatialization, which allows the use of Ambisonics up to 15th order.
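A sketch of the higher-order panner (the 3D variant above) follows; beyond the names already used in the earlier sketches, the filter-switching call (updateFilters) is an assumption based on the library's documented way of replacing decoding filters.

```js
// Sketch: third-order encoding of a mono source with binaural decoding.
import * as ambisonics from 'ambisonics';

const ctx = new AudioContext();
const order = 3;                                        // (order+1)^2 = 16 ambisonic channels

const encoder = new ambisonics.monoEncoder(ctx, order);
const decoder = new ambisonics.binDecoder(ctx, order);
encoder.out.connect(decoder.in);
decoder.out.connect(ctx.destination);

// Switching decoding filters on-the-fly: hand the decoder a multichannel buffer
// holding a different filter set (e.g. for another HRTF set or a lower order).
function switchFilters(filterBuffer) {
  decoder.updateFilters(filterBuffer);       // assumed JSAmbisonics call
}
```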
An example of loading HOA ambisonic recordings, with binaural decoding.
The user can control the rotation of the sound scene.
The example demonstrates the effect of the ambisonic order on the perception of the sound scene.
The user can switch between different sets of decoding filters on-the-fly.
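A sketch of the HOA player follows; the HOAloader constructor and callback signature are assumptions based on the library's documented usage (the loader is expected to fetch the recording split into 8-channel files and assemble a single multichannel AudioBuffer).

```js
// Sketch: load a HOA recording, rotate the scene, and decode binaurally.
import * as ambisonics from 'ambisonics';

const ctx = new AudioContext();
const order = 3;

const rotator = new ambisonics.sceneRotator(ctx, order);
const decoder = new ambisonics.binDecoder(ctx, order);
rotator.out.connect(decoder.in);
decoder.out.connect(ctx.destination);

// Placeholder URL; the loader reassembles the multichannel recording and calls back.
const loader = new ambisonics.HOAloader(ctx, order, 'recordings/hoa_scene.wav', (buffer) => {
  const player = ctx.createBufferSource();
  player.buffer = buffer;
  player.connect(rotator.in);
  player.start();
});
loader.load();
```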
An example of applying a HOA ambisonic room impulse response to an anechoic sound source, with binaural decoding.
The user can control the rotation of the sound scene.
The vector intensity visualizer analyzes the sound field and indicates the direction of the strongest sound activity.
In this case, the indicated direction changes rapidly, following the activity in the sound scene.
An example of listening to the output of a virtual microphone placed inside a HOA ambisonic recording.
The user can control the look direction of the microphone and its directivity pattern.
The vector intensity visualizer analyzes the sound field and indicates the direction of the strongest sound activity.
In this case, the indicated direction changes rapidly, following the activity in the sound scene.
An example of a higher-order ambisonic encoding of a sound source and decoding to an array of speakers. The example demonstrates how to use the Web Audio API for rendering on a multi-channel architecture.
An example of loading HOA ambisonic recordings, decoded on a multi-channel array of speakers. The example demonstrates how to use the Web Audio API for rendering on a multi-channel architecture.
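For both speaker-array examples above, the multi-channel part boils down to configuring the Web Audio destination for discrete output; the destination calls below are standard Web Audio API, while the ambisonic-to-speaker decoding stage itself is left abstract here.

```js
// Sketch: configure the Web Audio destination for discrete multichannel output,
// so that each decoded channel maps to one loudspeaker of the array.
const ctx = new AudioContext();

ctx.destination.channelCount = ctx.destination.maxChannelCount; // all channels the interface offers
ctx.destination.channelCountMode = 'explicit';
ctx.destination.channelInterpretation = 'discrete';             // no up/down-mixing

// `speakerFeeds` stands for the output of the ambisonic loudspeaker decoder
// (one channel per speaker); it connects directly to the configured destination.
// speakerFeeds.connect(ctx.destination);
```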
The following example combines spherical video rendering, Google Cardboard style,
with head-tracking based on the smartphone/tablet orientation sensors, and
corresponding ambisonic rendering using JSAmbisonics; a head-tracking sketch
follows the description below.
The WebGL/THREEJS code for the visuals is based on the tutorial code
available here.
The audio/video recording was made in the Helsinki Concert Hall (Musiikkitalo), during a rehearsal of a Brahms piece by the Sibelius Academy Symphony Orchestra.
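The head-tracking part can be sketched as follows; the mapping from the deviceorientation angles to yaw/pitch/roll is deliberately simplified (signs and offsets depend on the device and screen orientation), and `rotator` is the sceneRotator from the earlier sketches.

```js
// Sketch: drive the ambisonic scene rotation from the device orientation sensors,
// so the rendered sound field counter-rotates with the listener's head/device.
window.addEventListener('deviceorientation', (e) => {
  rotator.yaw = -e.alpha;    // compass heading -> scene yaw (simplified mapping)
  rotator.pitch = e.beta;
  rotator.roll = e.gamma;
  rotator.updateRotMtx();
});
```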
This is the same as the HOA panner example above, but demonstrates integration with HRTFs in the SOFA format. Instead of loading or switching the decoding filters directly, as in the previous examples, different sets of SOFA HRTFs are loaded and switched, and the decoding filters are generated automatically for the specified order.
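The idea can be sketched as below; `loadSofaHrirs` and `generateDecodingFilters` are hypothetical helpers standing in for the example's own SOFA tooling, and `updateFilters` is the assumed call for replacing the decoder's filters.

```js
// Sketch of the idea only: load HRIRs from a SOFA file, derive order-N binaural
// decoding filters from them, and swap them into the running decoder.
async function switchHrtfSet(url) {
  const hrirs = await loadSofaHrirs(ctx, url);            // hypothetical: SOFA file -> HRIR buffers
  const filters = generateDecodingFilters(hrirs, order);  // hypothetical: HRIRs -> decoding filters
  decoder.updateFilters(filters);                         // assumed JSAmbisonics filter swap
}
```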
This is the same as the HOA virtual microphone example above, but with an additional visualization of the rotated microphone pattern using WebGL.
This is the same as the FOA player, but using the HTML audio element to stream the audio. Useful, for example, for streaming large audio files on mobile devices.
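A sketch of the streaming variant follows, using a MediaElementAudioSourceNode (standard Web Audio API); the URL is a placeholder and the streamed file must keep its four FOA channels.

```js
// Sketch: stream the FOA recording through an HTML <audio> element instead of
// decoding the whole file into memory, then feed it into the same FOA chain.
const audioEl = new Audio('recordings/foa_scene.ogg');    // placeholder URL
audioEl.crossOrigin = 'anonymous';
audioEl.loop = true;

const streamSource = ctx.createMediaElementSource(audioEl);
streamSource.connect(rotator.in);                          // rotator -> binaural decoder as before
audioEl.play();
```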
(Not intended for mobiles)
Test script that goes through all the objects in the library, initializes them, and logs information to the console. Useful for checking whether the browser supports the library.
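Such a script can be as simple as the sketch below, instantiating the main objects (same assumed class names as in the sketches above) and logging them so that missing browser support shows up as console errors.

```js
// Sketch: instantiate the main JSAmbisonics objects and log them to the console.
import * as ambisonics from 'ambisonics';

const ctx = new AudioContext();
const order = 1;
const objects = {
  encoder: new ambisonics.monoEncoder(ctx, order),
  rotator: new ambisonics.sceneRotator(ctx, order),
  decoder: new ambisonics.binDecoder(ctx, order),
  virtualMic: new ambisonics.virtualMic(ctx, order),
  analyser: new ambisonics.intensityAnalyser(ctx),
};
for (const [name, obj] of Object.entries(objects)) {
  console.log(name, obj);
}
```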