
Flash MP3 Player, Spectrum Display

The audio track player on the site recently underwent a series of revisions. Not only did I upgrade the Flash app that plays the audio file within the web page
for each track, but I also implemented it in such a way that Javascript and Flash work together in a well-integrated fashion. All the controls use Javascript,
CSS, and HTML but call functionality from within the Flash-based player for each track. The Javascript also makes sure that no more than one track plays at a
time.

I learned quite a few new Actionscript 3 tricks here. Not only did I really get down and dirty with ExternalInterface.call() and ExternalInterface.addCallback(),
but I also found out some... frustrating facts about the differences between how Internet Explorer and Mozilla Firefox (and all other "real" web browsers) handle
Flash objects in a web page.
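In short, Internet Explorer scripts the Flash player through the object element, while Firefox and other browsers script it through the embed element, so the page code has to check for both. A minimal sketch of that lookup (the `getFlashMovie` helper and the `doc` parameter are my own names here; the document object is passed in so the function can run outside a browser):

```javascript
// Cross-browser lookup for a scriptable Flash movie: try the <embed> element
// first (Firefox and friends), then fall back to the <object> element (IE).
// "doc" is anything with a getElementById method - normally the real document.
function getFlashMovie(doc, baseId) {
    var el = doc.getElementById(baseId + '_embed');
    if (el) {
        return el;
    }
    return doc.getElementById(baseId + '_object');
}
```

The player-coordination code below uses the same embed-then-object fallback inline instead of a helper.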

First of all, let's look at the Javascript used to coordinate playing the audio tracks in the page through the Flash objects.

var currentPlayingTrack = false; // Variable to keep track of the currently playing track. If nothing is playing, it is false.
var player_obj;
var prev_player_obj;
var play_button;
var prev_play_button;

// This function starts or pauses playback for a specified track's Flash-based player in a track listing page.
function playTrack(track_ord){
    // Get the player object of the currently playing track, if applicable.
    if((currentPlayingTrack != false) && (currentPlayingTrack != track_ord)){
        if(document.getElementById('track_player_embed_' + currentPlayingTrack)){
            prev_player_obj = document.getElementById('track_player_embed_' + currentPlayingTrack);
        } else if(document.getElementById('track_player_object_' + currentPlayingTrack)){
            prev_player_obj = document.getElementById('track_player_object_' + currentPlayingTrack);
        }
        // Check to see if the previously played track is still buffering and, if so, disallow playing another track until it finishes buffering.
        if((currentPlayingTrack != false) && (prev_player_obj.isBuffered() == false)){
            // alert('You cannot play another track until the current one finishes loading.'); // Might freak people out.
            return;
        }
        prev_player_obj.playBackPause();
        prev_play_button = document.getElementById('track_play_button_' + currentPlayingTrack);
        prev_play_button.className = 'track_play_button';
    }
    // Get the player object of the track to be played / stopped now.
    if(document.getElementById('track_player_embed_' + track_ord)){
        player_obj = document.getElementById('track_player_embed_' + track_ord);
    } else if(document.getElementById('track_player_object_' + track_ord)){
        player_obj = document.getElementById('track_player_object_' + track_ord);
    }
    player_obj.playOrPause();
    currentPlayingTrack = track_ord;
    play_button = document.getElementById('track_play_button_' + track_ord); // Uses the global declared above; no "var" here.
    if(play_button.className == 'track_play_button'){
        play_button.className = 'track_pause_button';
    } else {
        play_button.className = 'track_play_button';
    }
}

// Resets a track play button. Called from within the track's Flash-based player.
function resetTrackPlayButton(track_ord){
    document.getElementById('track_play_button_' + track_ord).className = 'track_play_button';
}

// Updates the track playback position during playback.
function updateTrackPlayBackPosition(track_ord,position){
    document.getElementById('track_position_' + track_ord).innerHTML = position;
}

As the code shows, a few variables keep track of which audio, if any, is currently playing. If something is still buffering, switching tracks is disallowed
until the current / previous track finishes buffering. This is, as you will see below, because the internal commands in the Flash objects are not established
until after the audio is loaded.
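The decision playTrack() makes can be boiled down to a pure function (a sketch with hypothetical names; `currentTrack` is false when nothing is playing, and `currentBuffered` stands in for the player's isBuffered() result):

```javascript
// Reduced form of the track-switching rule: block the switch while another
// track is still buffering, switch when it is fully loaded, and otherwise
// just toggle play / pause on the requested track.
function trackSwitchAction(currentTrack, requestedTrack, currentBuffered) {
    if (currentTrack !== false && currentTrack !== requestedTrack) {
        if (!currentBuffered) {
            return 'blocked'; // Another track is still loading; refuse to switch.
        }
        return 'switch';      // Pause the old track, then start the new one.
    }
    return 'toggle';          // Same track (or nothing playing): play / pause.
}
```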

One new thing I wanted to try with this revised player was spectrum analysis on the audio: that is, showing the volume levels for a certain number of
frequency bands within the audio, using a quick, non-FFT method of analyzing the sound. I came up with two different methods:


  • Extract the raw audio data from the stream and take amplitude samples at different intervals to create the frequency bands.

  • Use Flash's built-in functionality to compute the spectrum.

To be completely honest, I couldn't determine which method was more accurate when it came to properly representing the spectrum in the audio for the given
frame rate, but the second method had a couple of very clear advantages over the first.

Before I get into that, however, let's take a look at my first version of the code. This code relies on existing movie clips in the Flash file, which I won't
describe in depth beyond saying that they use timelines of frames 1 through 11 for the meter levels and contain a fixed number of frequency band level meters.

var sourceSnd:Sound = new Sound();
var outputSnd:Sound = new Sound();
var channel:SoundChannel = new SoundChannel();
var urlReq:URLRequest = new URLRequest(stage.loaderInfo.parameters.soundurl); // soundurl
var playBackState:Boolean = false;
var loadTriggered:Boolean = false;
var wasReset:Boolean = false;
var pausePoint:Number = 0;
var track_ord = stage.loaderInfo.parameters.trackord; // The ID of the track, supplied externally.
var i:Number = 0; // Generic counter.

// Initially hide the spectrum and progress bar.
spectrum.visible = false;
progress.visible = false;

// Stop the spectrum animations initially.
for(i = 1; i <= 5; i++){
    spectrum['s' + i].stop();
}

// Stop the position and progress bars.
progress.loaded_mask.stop();
progress.position_mask.stop();

// Expose the playback control functions to Javascript.
ExternalInterface.addCallback("playOrPause", playbackToggle);
ExternalInterface.addCallback("playBackPause", playBackPause);
ExternalInterface.addCallback("isBuffered", isBuffered);

function isBuffered(){
    return (sourceSnd.bytesLoaded == sourceSnd.bytesTotal);
}

function loaded(event:Event):void{
    outputSnd.addEventListener(SampleDataEvent.SAMPLE_DATA, processSound);
    channel = outputSnd.play(pausePoint);
    channel.addEventListener(Event.SOUND_COMPLETE, resetPlayButton);
    playBackState = true;
}

function processSound(event:SampleDataEvent):void{
    var bytes:ByteArray = new ByteArray();
    if(wasReset == true){
        sourceSnd.extract(bytes, 2048, 0);
        event.data.position = 0;
        wasReset = false;
    } else {
        sourceSnd.extract(bytes, 2048);
    }
    event.data.writeBytes(bytes);
    updateSpectrum(bytes);
}

function loadSound(){
    sourceSnd.load(urlReq);
    sourceSnd.addEventListener(Event.COMPLETE, loaded);
    addEventListener(Event.ENTER_FRAME, progressHandler);
}

function resetPlayButton(event:Event){
    ExternalInterface.call("resetTrackPlayButton", track_ord);
    playBackState = false;
    pausePoint = 0;
    spectrum.visible = false;
    progress.visible = false;
    wasReset = true;
    ExternalInterface.call("updateTrackPlayBackPosition", track_ord, ''); // Clear position display in Javascript.
}

function progressHandler(event){
    var loadTime = sourceSnd.bytesLoaded / sourceSnd.bytesTotal;
    var time_display:String = '';
    progress.loaded_mask.gotoAndStop(Math.round(600 * loadTime));
    progress.position_mask.gotoAndStop(Math.round(600 * (channel.position / sourceSnd.length)));
    if(playBackState == true){
        // Time position display.
        var time_seconds = Math.floor(channel.position / 1000);
        var time_minutes = Math.floor(time_seconds / 60);
        var time_hours = Math.floor(time_seconds / 3600);
        var display_seconds = time_seconds % 60;
        if(display_seconds < 10){
            display_seconds = '0' + String(display_seconds);
        }
        var display_minutes = time_minutes % 60;
        if(display_minutes < 10){
            display_minutes = '0' + String(display_minutes);
        }
        if(time_hours == 0){
            time_display = ' / ' + String(display_minutes) + ':' + String(display_seconds);
        } else {
            time_display = ' / ' + String(time_hours) + ':' + String(display_minutes) + ':' + String(display_seconds);
        }
        ExternalInterface.call("updateTrackPlayBackPosition", track_ord, time_display);
    }
}

function playbackToggle(){
    if(playBackState == false){
        if(loadTriggered == false){
            loadSound();
            loadTriggered = true;
            spectrum.visible = true;
            progress.visible = true;
        } else {
            channel = outputSnd.play(pausePoint);
            channel.addEventListener(Event.SOUND_COMPLETE, resetPlayButton);
            playBackState = true;
            spectrum.visible = true;
            progress.visible = true;
        }
    } else {
        playBackPause();
    }
}

function playBackPause(){
    pausePoint = channel.position;
    channel.stop();
    playBackState = false;
    spectrum.visible = false;
    progress.visible = false;
    ExternalInterface.call("updateTrackPlayBackPosition", track_ord, ''); // Clear position display in Javascript.
}

function updateSpectrum(soundData:ByteArray){

    // Sample the data at 5 different intervals, each a power of two.
    var soundLevels:Array = new Array(0, 0, 0, 0, 0, 0); // Initialize the spectrum levels (index 0 is unused).
    var currentLevel:Number = 0;
    soundData.position = 0;
    while(soundData.bytesAvailable > 16){
        currentLevel = soundData.readFloat();
        if(soundData.position % 1024 == 0){
            soundLevels[1] += currentLevel;
        } else if(soundData.position % 256 == 0){
            soundLevels[2] += currentLevel;
        } else if(soundData.position % 128 == 0){
            soundLevels[3] += currentLevel;
        } else if(soundData.position % 32 == 0){
            soundLevels[4] += currentLevel;
        } else if(soundData.position % 4 == 0){
            soundLevels[5] += currentLevel;
        }
        soundData.position++;
    }

    // Compensate the amplitudes for the frequencies (the existing cumulative amplitudes are proportional to the frequency).
    soundLevels[1] *= 32;
    soundLevels[2] *= 16;
    soundLevels[3] *= 8;
    soundLevels[4] *= 4;
    soundLevels[5] *= 1;

    // Limit indicator levels to 10.
    for(i = 1; i <= 5; i++){
        if(soundLevels[i] > 10){
            soundLevels[i] = 10;
        }
    }

    // Translate the band amplitudes to spectrum meter displays.
    var meterLevels:Array = new Array(0, 1, 1, 1, 1, 1);
    for(i = 1; i <= 5; i++){
        meterLevels[i] = Math.round(Math.abs(soundLevels[i]));
        spectrum['s' + i].gotoAndStop(meterLevels[i]);
    }
}
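The interval-sampling in updateSpectrum() can be condensed into a small stand-alone sketch (a hypothetical helper using a plain array of floats in place of a ByteArray, with the same bands, weights, and cap as above):

```javascript
// Walk the samples and accumulate them into 5 bands keyed by how often their
// position divides by a power of two (rarer positions = lower "frequencies"),
// then weight each band so the sparsely-sampled ones aren't drowned out,
// and cap every level at 10 for the meter display.
function binSamples(samples) {
    var levels = [0, 0, 0, 0, 0, 0];   // index 0 unused; bands are 1-5
    var weights = [0, 32, 16, 8, 4, 1]; // compensation factors per band
    for (var pos = 1; pos <= samples.length; pos++) {
        var v = samples[pos - 1];
        if (pos % 1024 === 0)      { levels[1] += v; }
        else if (pos % 256 === 0)  { levels[2] += v; }
        else if (pos % 128 === 0)  { levels[3] += v; }
        else if (pos % 32 === 0)   { levels[4] += v; }
        else if (pos % 4 === 0)    { levels[5] += v; }
    }
    for (var i = 1; i <= 5; i++) {
        levels[i] = Math.min(Math.abs(levels[i] * weights[i]), 10);
    }
    return levels;
}
```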

As you can see, I have to create two separate sound objects and copy the audio from the first, which loads the audio from the file, to a second sound
object, which actually plays the audio and fires off the required sample data events that trigger the audio extraction. Using extract() doesn't work with
a single sound object because it doesn't fire off the sample data events. This is the first drawback of the first version of the player.

The second and most problematic disadvantage of the first method is that it has to load the entire audio file to be 100% certain that it won't run out of
buffer during playback and therefore extract, copy, and play glitched audio as it tries to play catch-up. Even requiring it to wait until it has loaded, say,
1 megabyte isn't a full guarantee that it won't glitch if the connection hiccups later during buffering. This is why I tried the second method.
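That trade-off can be sketched as a simple gate (hypothetical helper; the 1 MB figure is the threshold mentioned above):

```javascript
// Decide whether playback may start. Only a fully loaded file is truly safe;
// a partial load past some byte threshold (e.g. 1048576 = 1 MB) is a gamble
// that the connection won't hiccup during the rest of the buffering.
function canStartPlayback(bytesLoaded, bytesTotal, thresholdBytes) {
    if (bytesLoaded >= bytesTotal) {
        return true; // fully buffered: no underrun possible
    }
    return bytesLoaded >= thresholdBytes; // partial: best-effort guess
}
```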

Here is the second method:

var sourceSnd:Sound = new Sound(); // Sound that is loaded and played as audio.
var channel:SoundChannel = new SoundChannel();
var urlReq:URLRequest = new URLRequest(stage.loaderInfo.parameters.soundurl); // soundurl
var playBackState:Boolean = false;
var loadTriggered:Boolean = false;
var wasReset:Boolean = false;
var pausePoint:Number = 0;
var track_ord = stage.loaderInfo.parameters.trackord; // The ID of the track, supplied externally.
var i:Number = 0; // Generic counter.

// Domain security policy settings.
Security.allowDomain("http://content.nitrocosm.com");
Security.loadPolicyFile("http://content.nitrocosm.com/crossdomain.xml");

// Initially hide the spectrum and progress bar.
spectrum.visible = false;
progress.visible = false;

// Stop the spectrum animations initially.
for(i = 1; i <= 8; i++){
    spectrum['s' + i].stop();
}

// Stop the position and progress bars.
progress.loaded_mask.stop();
progress.position_mask.stop();

// Expose the playback control functions to Javascript.
ExternalInterface.addCallback("playOrPause", playbackToggle);
ExternalInterface.addCallback("playBackPause", playBackPause);
ExternalInterface.addCallback("isBuffered", isBuffered);

function isBuffered(){
    return (sourceSnd.bytesLoaded == sourceSnd.bytesTotal);
}

function loaded(event:ProgressEvent):void{
    if((playBackState == false) && (sourceSnd.bytesLoaded > 1048576)){
        channel = sourceSnd.play(pausePoint);
        channel.addEventListener(Event.SOUND_COMPLETE, resetPlayButton);
        stage.addEventListener(Event.ENTER_FRAME, progressHandler);
        playBackState = true;
    }
}

function loadSound(){
    sourceSnd.load(urlReq);
    sourceSnd.addEventListener(ProgressEvent.PROGRESS, loaded);
}

function resetPlayButton(event:Event){
    ExternalInterface.call("resetTrackPlayButton", track_ord);
    playBackState = false;
    pausePoint = 0;
    spectrum.visible = false;
    progress.visible = false;
    wasReset = true;
    ExternalInterface.call("updateTrackPlayBackPosition", track_ord, ''); // Clear position display in Javascript.
}

function progressHandler(event){
    var loadTime = sourceSnd.bytesLoaded / sourceSnd.bytesTotal;
    progress.loaded_mask.gotoAndStop(Math.round(600 * loadTime));
    progress.position_mask.gotoAndStop(Math.round(600 * (channel.position / sourceSnd.length)));
    if(playBackState == true){
        updateTimeDisplay();
        updateSpectrum();
    }
}

function playbackToggle(){
    if(playBackState == false){
        if(loadTriggered == false){
            loadSound();
            loadTriggered = true;
            spectrum.visible = true;
            progress.visible = true;
        } else {
            channel = sourceSnd.play(pausePoint);
            channel.addEventListener(Event.SOUND_COMPLETE, resetPlayButton);
            playBackState = true;
            spectrum.visible = true;
            progress.visible = true;
        }
    } else {
        playBackPause();
    }
}

function playBackPause(){
    pausePoint = channel.position;
    channel.stop();
    playBackState = false;
    spectrum.visible = false;
    progress.visible = false;
    ExternalInterface.call("updateTrackPlayBackPosition", track_ord, ''); // Clear position display in Javascript.
}

function updateTimeDisplay(){
    // Time position display.
    var time_display:String = '';
    var time_seconds = Math.floor(channel.position / 1000);
    var time_minutes = Math.floor(time_seconds / 60);
    var time_hours = Math.floor(time_seconds / 3600);
    var display_seconds = time_seconds % 60;
    if(display_seconds < 10){
        display_seconds = '0' + String(display_seconds);
    }
    var display_minutes = time_minutes % 60;
    if(display_minutes < 10){
        display_minutes = '0' + String(display_minutes);
    }
    if(time_hours == 0){
        time_display = ' / ' + String(display_minutes) + ':' + String(display_seconds);
    } else {
        time_display = ' / ' + String(time_hours) + ':' + String(display_minutes) + ':' + String(display_seconds);
    }
    ExternalInterface.call("updateTrackPlayBackPosition", track_ord, time_display);
}

function updateSpectrum(){
    // Get, process, and display the spectrum data.
    var spectrumData:ByteArray = new ByteArray();
    var curLevel:Number = 0;
    SoundMixer.computeSpectrum(spectrumData, false, 0);
    for(i = 1; i <= 8; i++){
        curLevel = Math.ceil(Math.abs(spectrumData.readFloat() * 10));
        spectrumData.position += 32;
        curLevel *= 2;
        if(curLevel > 11) curLevel = 11; // It goes to eleven. Seriously, the meters have frames 1 to 11.
        spectrum['s' + i].gotoAndStop(curLevel);
    }
}

The second method is far cleaner; it's more elegant. It uses the built-in functionality of SoundMixer.computeSpectrum() instead of my rough interval-based
sampling. The only drawback is that I had to set the security to allow cross-domain file loading. This was accomplished by creating a file called "crossdomain.xml"
for the site content sub-domain:

<?xml version="1.0"?>
<!DOCTYPE cross-domain-policy SYSTEM "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
<cross-domain-policy>
    <site-control permitted-cross-domain-policies="all"/>
    <allow-access-from domain="*"/>
</cross-domain-policy>

It's a wide-open policy, but it works fine for this sub-domain. It was a little confusing at first: the second player system worked fine on the development
server but wouldn't show the spectrum analyzer on the live site, even though the music files loaded and the audio played perfectly. After a bit of research,
I discovered that the cross-domain policy was required. I just had to put that crossdomain.xml file in the root directory of the content sub-domain and then add
the lines:

Security.allowDomain("http://content.nitrocosm.com");
Security.loadPolicyFile("http://content.nitrocosm.com/crossdomain.xml");

to the Actionscript. The security settings have to allow cross-domain loading because SoundMixer.computeSpectrum() uses stricter security rules than the sound objects and the extract() method do.

On another note, one of the real "gotchas" in Actionscript is that any movie clips or sprites you reference in a function must exist on the timeline at the
time the function is defined, not just when it is called. Not identifying this problem sooner led to many hours of confusion.

If you need to make your own Flash-based music player, feel free to use my code. I really don't mind and you can modify it any way you want to. If you do use it,
I would like an e-mail to let me know that you found it useful, but you certainly don't have to. You might find some great ways to improve upon it, as well.

I hope you found this information useful.
