1. Creating a Three.js scene and adding the required geometry elements
2. Reading the sound file content
3. Analysing the audio data and rendering the scene
Before we even start, you can take a look at the end product here.
Create a Three.js scene
To render a 3D scene on the screen, we need three main elements: a scene, a camera, and a renderer.
The AudioVisualizer class holds everything we need to make our awesome visualizer work. The next step is to initialize our Three.js elements one by one. The function below creates a Three.js scene, a WebGL renderer, a camera, and a light. If you want to get into the details of Three.js, I recommend reading the tutorial from AeroTwist.com.
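A minimal sketch of that initialization step might look like this (the property names, sizes, and camera position here are my own illustrative choices, not necessarily the demo's exact values):

```javascript
// Sketch of the AudioVisualizer initialization (names are illustrative).
function initializeVisualizer(viz, width, height) {
  // The scene is the container for every 3D object we create.
  viz.scene = new THREE.Scene();

  // WebGL renderer, sized to the drawing area and attached to the page.
  viz.renderer = new THREE.WebGLRenderer({ antialias: true });
  viz.renderer.setSize(width, height);
  document.body.appendChild(viz.renderer.domElement);

  // Perspective camera, pulled back so the bars are in view.
  viz.camera = new THREE.PerspectiveCamera(40, width / height, 0.1, 1000);
  viz.camera.position.z = 60;
  viz.scene.add(viz.camera);

  // A directional light so the cubes are shaded rather than flat.
  const light = new THREE.DirectionalLight(0xffffff, 1);
  light.position.set(0, 1, 1);
  viz.scene.add(light);
}
```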
With the 3D scene in place, we can now add 3D geometry to it. The createBars function creates BoxGeometries (cubes) of size 0.5 x 0.5 x 0.5. Each cube is assigned a random color to make the visualization a little more exciting. All the created bars are stored in an array so we can animate them later with the audio data.
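A sketch of createBars, assuming the scene property from the initialization step (the material type and x-axis layout are my choices; getRandomColor is the helper described later in this post):

```javascript
// Sketch of createBars: one 0.5 x 0.5 x 0.5 cube per frequency band,
// laid out along the x axis and kept in an array for later animation.
function createBars(viz, numberOfBars) {
  viz.bars = [];
  for (let i = 0; i < numberOfBars; i++) {
    const geometry = new THREE.BoxGeometry(0.5, 0.5, 0.5);
    // Each bar gets its own random color.
    const material = new THREE.MeshPhongMaterial({ color: getRandomColor() });
    const bar = new THREE.Mesh(geometry, material);
    // Center the row of bars around x = 0.
    bar.position.x = i - numberOfBars / 2;
    viz.bars.push(bar);
    viz.scene.add(bar);
  }
}
```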
Reading sound file content
We now have our 3D scene with custom 3D bars that will be used to visualize the audio. The next step is to get the audio file content, either from the server or from the user's computer. We could always request a file from the server and visualize that, but it is much more interesting if users can drag and drop their own audio files and see the magic happen before their eyes!
The piece of code below reads the content of the file the user dropped into the browser. I know, HTML5 is awesome!
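A sketch of the drop handler using the HTML5 File API (the handler and the `viz.start` hand-off are illustrative names; the key parts are `dataTransfer.files` and `FileReader.readAsArrayBuffer`):

```javascript
// Sketch of drag-and-drop file reading with the HTML5 File API.
function handleDrop(viz, event) {
  event.stopPropagation();
  event.preventDefault();

  const file = event.dataTransfer.files[0]; // the dropped audio file

  const reader = new FileReader();
  reader.onload = function (e) {
    // e.target.result is an ArrayBuffer with the raw audio bytes;
    // we hand it over to the Web Audio step in the next section.
    viz.start(e.target.result);
  };
  reader.readAsArrayBuffer(file);
}

// Wire it up to a drop target, e.g.:
//   document.body.addEventListener('dragover', (e) => e.preventDefault());
//   document.body.addEventListener('drop', (e) => handleDrop(visualizer, e));
```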
Analysing audio data and rendering the scene
Now this is the important part. Using the audio data read in the previous step, we are going to present it in the 3D scene created earlier. In order to do the visualization, there are some Web Audio elements we need to create and connect. That's right, CONNECT!
The elements required are: an AudioContext, a source buffer to hold the song, an AnalyserNode, and a ScriptProcessorNode.
These nodes need to be connected as shown below. We won't go into too much detail about what each node does, as we are not planning to do audio production for a Hollywood blockbuster anytime soon ;)
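One common way to wire up this graph looks like the sketch below (the property names follow my earlier sketches; the source also connects straight to the destination so the song is audible):

```javascript
// Sketch of the node graph:
//
//   source -> analyser -> scriptProcessor -> destination
//     '----------------------------------------^
//
function connectNodes(viz) {
  // The source feeds the analyser so we can read frequency data...
  viz.sourceBuffer.connect(viz.analyser);
  // ...the analyser feeds the script processor, whose onaudioprocess
  // callback will drive our render loop...
  viz.analyser.connect(viz.javascriptNode);
  viz.javascriptNode.connect(viz.audioContext.destination);
  // ...and the source also goes directly to the speakers so we hear it.
  viz.sourceBuffer.connect(viz.audioContext.destination);
}
```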
A word of warning, though: according to MDN, the ScriptProcessorNode is deprecated. You can read more about it here.
Note: As of the August 29 2014 Web Audio API spec publication, this feature has been marked as deprecated, and is soon to be replaced by Audio Workers.
Next we create a source buffer, which will hold the song. The analyser node is the one that will provide us with information about the audio in real time, so we can render the 3D scene accordingly.
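A sketch of creating these nodes and starting playback once the dropped file's ArrayBuffer is decoded (buffer size, fftSize, and smoothing values here are typical choices, not necessarily the demo's):

```javascript
// Sketch of creating the Web Audio nodes and starting playback.
function startAudio(viz, arrayBuffer) {
  viz.audioContext = new (window.AudioContext || window.webkitAudioContext)();

  // ScriptProcessorNode: its onaudioprocess callback fires repeatedly
  // while the song plays (deprecated, as noted above).
  viz.javascriptNode = viz.audioContext.createScriptProcessor(2048, 1, 1);

  // AnalyserNode: exposes real-time frequency data for animating the bars.
  viz.analyser = viz.audioContext.createAnalyser();
  viz.analyser.smoothingTimeConstant = 0.3;
  viz.analyser.fftSize = 512;

  // Source buffer: holds and plays the decoded song.
  viz.sourceBuffer = viz.audioContext.createBufferSource();

  viz.audioContext.decodeAudioData(arrayBuffer, function (buffer) {
    viz.sourceBuffer.buffer = buffer;
    viz.sourceBuffer.start(0);
  });
}
```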
Begin Audio processing
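The processing itself happens in the script processor's onaudioprocess callback: read the current frequency data, average it into one value per bar, and scale the cubes. A sketch, assuming the properties from the earlier steps (the averaging helper and the scale factor of 10 are my own choices):

```javascript
// Pure helper: average of one slice of the frequency array.
function averageSlice(frequencyData, start, end) {
  let sum = 0;
  for (let i = start; i < end; i++) sum += frequencyData[i];
  return sum / (end - start);
}

// Sketch of the onaudioprocess handler.
function onAudioProcess(viz) {
  // Grab the current frequency spectrum from the analyser.
  const frequencyData = new Uint8Array(viz.analyser.frequencyBinCount);
  viz.analyser.getByteFrequencyData(frequencyData);

  // Divide the spectrum evenly among the bars and scale each bar
  // by the loudness of its slice.
  const step = Math.floor(frequencyData.length / viz.bars.length);
  viz.bars.forEach(function (bar, i) {
    const value = averageSlice(frequencyData, i * step, (i + 1) * step);
    bar.scale.y = Math.max(value / 10, 0.001);
  });

  // Re-render the scene with the updated bar heights.
  viz.renderer.render(viz.scene, viz.camera);
}
```

Hook it up with something like `viz.javascriptNode.onaudioprocess = function () { onAudioProcess(viz); };`.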
The method below generates a random color every time we call it. It is taken from this Stack Overflow question.
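The helper builds a hex color string one random digit at a time:

```javascript
// Generate a random hex color string, e.g. "#3FA2C8".
function getRandomColor() {
  const letters = '0123456789ABCDEF';
  let color = '#';
  for (let i = 0; i < 6; i++) {
    color += letters[Math.floor(Math.random() * 16)];
  }
  return color;
}
```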
Additionally, as you may have noticed in the demo, the scene is interactive: you can rotate, zoom in, and zoom out. This is because the demo uses a utility library for Three.js called "OrbitControls.js". You can grab the file here.
To make the orbit controls work with our scene, add the following line to the "initialize" method,
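Assuming the camera and renderer properties from the initialization sketch, the line looks roughly like this:

```javascript
// Attach orbit controls to the camera, listening for mouse events
// on the renderer's canvas.
this.controls = new THREE.OrbitControls(this.camera, this.renderer.domElement);
```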
and the following line to the “onaudioprocess” method.
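Again assuming the controls property created in "initialize", the update call is just:

```javascript
// Let the controls respond to user input on every frame we render.
this.controls.update();
```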