* Added mozChannels, mozSampleRate, mozFrameBufferLength to nsHTMLMediaElement
* Removed mozSetFrameBufferLength from nsHTMLAudioElement
* Converted mozTime (in the AudioAvailable event) to a float value (seconds instead of ms)

Demos written for the previous version are '''not''' compatible, though they can be updated quite easily. See details below.
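For instance, updating a demo is mostly a matter of renaming and rescaling. A hedged sketch of the kind of change involved (the surrounding handler code is illustrative):

<pre>
// Before: the framebuffer length was set via a method on the audio
// element, and event times arrived in milliseconds:
//   audio.mozSetFrameBufferLength(2048);
//   var seconds = event.mozTime / 1000;

// After: mozFrameBufferLength is a settable attribute on the media
// element, and mozTime is a float already measured in seconds.
audio.mozFrameBufferLength = 2048;
var seconds = event.mozTime;
</pre>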
===== Reading Audio =====

Audio data is made available via an event-based API. As the audio is played, and therefore decoded, sample data is passed to content scripts in a framebuffer for processing after becoming available to the audio layer--hence the name, '''AudioAvailable'''. These samples may or may not have been played yet at the time of the event. The audio samples returned in the event are raw, and have not been adjusted for mute/volume settings on the media element. Playing and pausing the audio also affect the streaming of this raw audio data.

Users of this API can register two callbacks on the <audio> or <video> element in order to consume this data:
<pre>
<audio src="song.ogg"
       onloadedmetadata="audioInfo();"
       onaudioavailable="audioAvailable(event);">
</audio>
</pre>
In addition, three new attributes are available on the media element:

* mozChannels
* mozSampleRate
* mozFrameBufferLength
Prior to the '''LoadedMetadata''' event, these attributes will return 0 (zero), indicating that they are not known, or that there is no audio. These attributes indicate the '''number of channels''', audio '''sample rate per second''', and the '''default size of the framebuffer''' that will be used in '''AudioAvailable''' events. The '''LoadedMetadata''' event is fired once, as the media resource is first loaded, and is useful for interpreting or writing the audio data.

The '''AudioAvailable''' event provides two pieces of data: a framebuffer (i.e., an array) containing decoded audio sample data (i.e., floats), and the time for these samples, measured in seconds from the start.

The following is an example of how both events might be used:
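<pre>
var channels, rate, frameBufferLength;

function audioInfo() {
  // A hedged reconstruction of this elided part of the example:
  // stash the metadata attributes described above for later use.
  var audio = document.getElementsByTagName('audio')[0];
  channels          = audio.mozChannels;
  rate              = audio.mozSampleRate;
  frameBufferLength = audio.mozFrameBufferLength;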
}

function audioAvailable(event) {
  var samples = event.mozFrameBuffer;
  var time = event.mozTime;

  // ... use the decoded samples and their start time here ...
}
</pre>
<pre>
<audio src="song.ogg"
       controls="true"
       onloadedmetadata="loadedMetadata();"
       onaudioavailable="audioAvailable(event);"
       style="width: 512px;">
</audio>
</pre>
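<pre>
var channels;

function loadedMetadata() {
  // A hedged reconstruction of the elided handler: only the channel
  // count is needed by audioAvailable() below.
  channels = document.getElementsByTagName('audio')[0].mozChannels;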
}

function audioAvailable(event) {
  var fb = event.mozFrameBuffer,
      signal = new Float32Array(fb.length / channels),
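      magnitude, i, c;

  // A hedged completion of this truncated handler: average the
  // interleaved channels into a single signal (the original demo
  // then plots this signal; the drawing code is omitted here).
  for (i = 0; i < signal.length; i++) {
    magnitude = 0;
    for (c = 0; c < channels; c++) {
      magnitude += fb[i * channels + c];
    }
    signal[i] = magnitude / channels;
  }
}
</pre>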
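<pre>
// A hedged reconstruction of the example's elided setup: create an
// audio stream for writing; mozSetup(channels, sampleRate) configures it.
var audioOutput = new Audio();
audioOutput.mozSetup(2, 44100);
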
// Write samples using a JS Array
var samples = [0.242, 0.127, 0.0, -0.058, -0.242, ...];
var numberSamplesAvailable = audioOutput.mozWriteAudio(samples);

// Write samples using a Typed Array
var samples = new Float32Array([0.242, 0.127, 0.0, -0.058, -0.242, ...]);
var numberSamplesAvailable = audioOutput.mozWriteAudio(samples);
</pre>
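Because the audio layer may accept fewer samples than were passed in, the value returned by '''mozWriteAudio()''' can be used to retry the remainder later. A minimal hedged sketch (the retry scheme, and the assumption that the return value counts accepted samples, are illustrative):

<pre>
var pendingSamples = new Float32Array(0);  // tail not yet accepted

function writeAll(stream, samples) {
  // Keep whatever was not accepted so a later call (e.g., driven by
  // setInterval) can try again with the tail.
  var written = stream.mozWriteAudio(samples);
  pendingSamples = samples.subarray(written);
}
</pre>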
<pre>
// Get the current position of the underlying audio stream, measured in samples
var currentSampleOffset = audioOutput.mozCurrentSampleOffset();
</pre>
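One use for this offset is keeping a written stream a fixed distance ahead of playback. A hedged sketch, where '''prebufferSize''', '''samplesWritten''', and '''makeSamples()''' are illustrative placeholders rather than part of the API:

<pre>
var samplesWritten = 0;
var prebufferSize = 22050;  // assumed write-ahead amount, in samples

setInterval(function() {
  // Top the stream up whenever playback has caught up to within
  // prebufferSize samples of what has been written so far.
  if (samplesWritten - audioOutput.mozCurrentSampleOffset() < prebufferSize) {
    samplesWritten += audioOutput.mozWriteAudio(makeSamples());
  }
}, 100);
</pre>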
Since the '''AudioAvailable''' event and the '''mozWriteAudio()''' method both use '''Float32Array''', it is possible to take the output of one audio stream and pass it directly (or process first and then pass) to a second:

<pre>
<audio src="song.ogg"
       onloadedmetadata="loadedMetadata();"
       onaudioavailable="audioAvailable(event);"
       controls="controls">
</audio>
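var audioOutput = new Audio();

function loadedMetadata() {
  // A hedged reconstruction of the elided handler: configure the
  // destination stream to match the source element's audio settings.
  var source = document.getElementsByTagName('audio')[0];
  audioOutput.mozSetup(source.mozChannels, source.mozSampleRate);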
}

function audioAvailable(event) {
  // Write the current framebuffer straight into the second stream
  var frameBuffer = event.mozFrameBuffer;
  audioOutput.mozWriteAudio(frameBuffer);
}
</pre>
== DOM Implementation ==

===== nsIDOMNotifyAudioAvailableEvent =====

Audio data is made available via the following event:

* '''Event''': AudioAvailableEvent
* '''Event handler''': onaudioavailable

The '''AudioAvailableEvent''' is defined as follows:

<pre>
interface nsIDOMNotifyAudioAvailableEvent : nsIDOMEvent
{
  // mozFrameBuffer is really a Float32Array, via dom_quickstubs
  readonly attribute jsval  mozFrameBuffer;
  readonly attribute float  mozTime;
};
</pre>
The '''mozChannels''' attribute contains the number of channels in the audio resource (e.g., 2). The '''mozSampleRate''' attribute contains the number of samples per second that will be played, for example 44100. Both are readonly.
The '''mozFrameBufferLength''' attribute indicates the number of samples that will be returned in the framebuffer of each '''AudioAvailable''' event. This number is a total for all channels, and by default is set to be the number of channels * 1024 (e.g., 2 channels * 1024 samples = 2048 total).
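To relate this length to time, a hedged example calculation (the figures are the sample values used above, not fixed properties of the API):

<pre>
// With 2 channels at 44100 samples/second, a default framebuffer of
// 2 * 1024 = 2048 values holds 2048 / (2 * 44100), or about 0.023
// seconds of audio per AudioAvailable event.
var seconds = audio.mozFrameBufferLength / (audio.mozChannels * audio.mozSampleRate);
</pre>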
The '''mozFrameBufferLength''' attribute can also be set to a new value if users want lower latency or larger amounts of data. The size given '''must''' be a power of 2 between 512 and 32768. The following are all valid lengths:
* 512
* 1024
* 2048
* 4096
* 8192
* 16384
* 32768
Using any other size will result in an exception being thrown. The best time to set a new length is after the '''loadedmetadata''' event fires, when the audio info is known, but before the audio has started playing or '''AudioAvailable''' events have begun firing.
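For example, a minimal hedged sketch of resizing the framebuffer (the element lookup is illustrative):

<pre>
function loadedMetadata() {
  // Request smaller framebuffers for lower latency; any value that is
  // not a power of 2 between 512 and 32768 throws an exception.
  document.getElementsByTagName('audio')[0].mozFrameBufferLength = 512;
}
</pre>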
===== nsIDOMHTMLAudioElement additions =====