
User:David.humphrey/Audio Data API 2

692 bytes added, 00:30, 21 May 2010
Reading Audio
Audio data is made available via an event-based API. As the audio is played, and therefore decoded, each frame is passed to content scripts for processing after being written to the audio layer, hence the name '''AudioWritten'''. Playing and pausing the audio both affect the streaming of this raw audio data.
Consumers of this raw audio data register two callbacks on the <audio> or <video> element like so:
<pre>
<audio src="song.ogg" onloadedmetadata="audioInfo(event);" onaudiowritten="audioWritten(event);"></audio>
</pre>
The '''LoadedMetadata''' event is a standard part of HTML5, and has been extended to provide more detailed information about the audio stream. Specifically, developers can obtain the number of channels and the sample rate of the audio. This event is fired once, as the media resource is first loaded, and is useful for interpreting or writing the audio data. The '''AudioWritten''' event provides two pieces of data. The first is a framebuffer (i.e., an array) containing sample data for the current frame. The second is the time (in milliseconds) for the start of this frame. The following is an example of how both events might be used:
<pre>
var channels, rate;

function audioInfo(event) {
  channels = event.mozChannels;
  rate     = event.mozRate;
}

function audioWritten(event) {
  var samples = event.mozFrameBuffer;
  var time    = event.mozTime;

  for (var i = 0, slen = samples.length; i < slen; i++) {
    // Do something with the audio data as it is played.
    processSample(samples[i], channels, rate);
  }
}
</pre>
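As an illustration of what such per-frame processing might do, the sketch below computes the root-mean-square (RMS) amplitude of a frame (e.g. to drive a volume meter) and the duration of a frame in milliseconds from the '''LoadedMetadata''' values. It operates on a plain array standing in for the event's framebuffer; the function names and the assumption that samples are floats in the range [-1.0, 1.0] are illustrative, not part of the API.

```javascript
// Sketch: RMS amplitude of one frame of audio samples.
// "frameBuffer" stands in for the array delivered with each AudioWritten
// event; samples are assumed to be floats in the range [-1.0, 1.0].
function frameRMS(frameBuffer) {
  var sum = 0;
  for (var i = 0, slen = frameBuffer.length; i < slen; i++) {
    sum += frameBuffer[i] * frameBuffer[i];
  }
  return Math.sqrt(sum / frameBuffer.length);
}

// Duration of a frame in milliseconds, given the channel count and
// sample rate obtained from the LoadedMetadata event. The framebuffer
// holds samples for all channels, hence the division by "channels".
function frameDurationMs(sampleCount, channels, rate) {
  return (sampleCount / channels / rate) * 1000;
}

// A constant half-amplitude square wave has an RMS of 0.5.
var rms = frameRMS([0.5, -0.5, 0.5, -0.5]); // rms === 0.5

// One second of stereo audio at 44100 Hz is 88200 samples.
var ms = frameDurationMs(88200, 2, 44100); // ms === 1000
```

Values such as `rms` could then be handed to a visualization routine in place of the `processSample` call above.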