API pages needing examples

Total: 428

Path | Summary | Modification date
apis/webrtc/RTCPeerConnection/onremovestream | Handles the removeStream event, which fires when setRemoteDescription() removes a MediaStream object on the remote peer. (sketch below) | 23 July 2015 01:35:28
apis/audio-video/video | The HTML video element lets the creator of an HTML document or web application view embed a video that is displayed when a user visits the view or opens the document in a web browser. (sketch below) | 27 April 2015 20:36:26
apis/webaudio/ScriptProcessorNode/onaudioprocess | An event listener which is called periodically for audio processing. An event of type AudioProcessingEvent will be passed to the event handler. Deprecated; deletion candidate. See http://webaudio.github.io/web-audio-api/. (sketch below) | 31 December 2014 19:49:55
apis/webaudio/ScriptProcessorNode/bufferSize | The size of the buffer (in sample-frames) which needs to be processed each time onaudioprocess is called. Legal values are 256, 512, 1024, 2048, 4096, 8192, and 16384. Deprecated; deletion candidate. See http://webaudio.github.io/web-audio-api/. (sketch below) | 31 December 2014 19:49:09
apis/webaudio/OscillatorNode/setWaveTable | Sets an arbitrary custom periodic waveform given a WaveTable. Not in spec; deletion candidate. See http://webaudio.github.io/web-audio-api/. (sketch below) | 31 December 2014 19:17:53
apis/webaudio/OscillatorNode/playbackState | The playback state, initialized to UNSCHEDULED_STATE, progressing through SCHEDULED_STATE, PLAYING_STATE, and FINISHED_STATE. Not in spec; deletion candidate. See http://webaudio.github.io/web-audio-api/. (sketch below) | 31 December 2014 19:17:05
apis/webaudio/AudioProcessingEvent/playbackTime | The time when the audio will be played, in the same time coordinate system as AudioContext.currentTime. playbackTime allows for very tight synchronization between processing directly in JavaScript and the other events in the context's rendering graph. Deprecated; deletion candidate. See http://webaudio.github.io/web-audio-api/. (sketch below) | 31 December 2014 18:33:07
apis/webaudio/AudioProcessingEvent/outputBuffer | An AudioBuffer where the output audio data should be written. It will have a number of channels equal to the numberOfOutputChannels parameter of the createScriptProcessor() method. Script code within the scope of the onaudioprocess function is expected to modify the Float32Array arrays representing channel data in this AudioBuffer. Any script modifications to this AudioBuffer outside of this scope will not produce any audible effects. Deprecated; deletion candidate. See http://webaudio.github.io/web-audio-api/. (sketch below) | 31 December 2014 18:32:23
apis/webaudio/AudioProcessingEvent/node | The ScriptProcessorNode associated with this processing event. Deprecated; deletion candidate. See http://webaudio.github.io/web-audio-api/. (sketch below) | 31 December 2014 18:31:36
apis/webaudio/AudioProcessingEvent/inputBuffer | An AudioBuffer containing the input audio data. It will have a number of channels equal to the numberOfInputChannels parameter of the createScriptProcessor() method. This AudioBuffer is only valid while in the scope of the onaudioprocess function. Its values will be meaningless outside of this scope. Deprecated; deletion candidate. See http://webaudio.github.io/web-audio-api/. (sketch below) | 31 December 2014 18:29:47
apis/webaudio/AudioParam/minValue | Nominal minimum value. The value attribute may be set lower than this value. Not in spec; deletion candidate. See http://webaudio.github.io/web-audio-api/. (sketch below) | 31 December 2014 18:17:59
apis/webaudio/AudioParam/maxValue | Nominal maximum value. The value attribute may be set higher than this value. Not in spec; deletion candidate. See http://webaudio.github.io/web-audio-api/. (sketch below) | 31 December 2014 18:17:07
apis/webaudio/AudioParam/computedValue | The final value controlling the audio DSP, calculated at each time, which is either the value set directly to the value attribute or, if there are any scheduled parameter changes (automation events), the value as calculated from these events. Not in spec; deletion candidate. See http://webaudio.github.io/web-audio-api/. (sketch below) | 31 December 2014 18:06:57
apis/webaudio/AudioDestinationNode/numberOfChannels | The number of channels of the destination's input. This value will default to 2, and may be set to any non-zero value less than or equal to maxChannelCount. An exception will be thrown if this value is not within the valid range. Not in spec; deletion candidate. Possibly confused with AudioBuffer/numberOfChannels. See http://webaudio.github.io/web-audio-api/. (sketch below) | 29 December 2014 21:23:51
apis/webaudio/AudioContext/createWaveTable | Creates a WaveTable representing a waveform containing arbitrary harmonic content. The real and imag parameters must be of type Float32Array of equal lengths greater than zero and less than or equal to 4096 or an exception will be thrown. These parameters specify the Fourier coefficients of a Fourier series representing the partials of a periodic waveform. The created WaveTable will be used with an OscillatorNode and will represent a normalized time-domain waveform having a maximum absolute peak value of 1. Another way of saying this is that the generated waveform of an OscillatorNode will have its maximum peak value at 0dBFS. Conveniently, this corresponds to the full range of the signal values used by the Web Audio API. Because the WaveTable will be normalized on creation, the real and imag parameters represent relative values. Out of date; removed from spec. See http://webaudio.github.io/web-audio-api/. (sketch below) | 24 December 2014 18:56:25
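
The sketches below are not taken from the wiki pages themselves; any placeholder names are called out in each lead-in. First, a minimal sketch for apis/webrtc/RTCPeerConnection/onremovestream, using the legacy stream-based handler the entry describes (the current API works in terms of tracks rather than streams). The remoteDescription value is assumed to arrive over a signaling channel that is not shown.

    // Legacy handler: fires when a newly applied remote description no longer
    // carries a MediaStream that the remote peer previously sent.
    const pc = new RTCPeerConnection();

    pc.onremovestream = function (event) {
      // event.stream is the MediaStream removed on the remote peer.
      console.log('Remote stream removed:', event.stream.id);
    };

    // Applying a remote description that drops a stream triggers the event;
    // remoteDescription is assumed to come from the signaling channel.
    // pc.setRemoteDescription(remoteDescription);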
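
A sketch for apis/audio-video/video, equivalent to the markup <video src="movie.mp4" controls></video>; the file name movie.mp4 is a placeholder.

    // Embed a video in the current document and show the browser's controls.
    const video = document.createElement('video');
    video.src = 'movie.mp4';   // placeholder URL
    video.controls = true;
    document.body.appendChild(video);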
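
A sketch for apis/webaudio/ScriptProcessorNode/onaudioprocess and apis/webaudio/ScriptProcessorNode/bufferSize, using the deprecated ScriptProcessorNode interface these entries describe (AudioWorklet is the current replacement).

    const context = new AudioContext();

    // bufferSize must be one of 256, 512, 1024, 2048, 4096, 8192 or 16384;
    // one input channel and one output channel are requested here.
    const processor = context.createScriptProcessor(4096, 1, 1);
    console.log(processor.bufferSize); // 4096

    // Called periodically; each call delivers bufferSize sample-frames.
    processor.onaudioprocess = function (event) {
      const input = event.inputBuffer.getChannelData(0);
      const output = event.outputBuffer.getChannelData(0);
      output.set(input); // simple pass-through
    };

    processor.connect(context.destination);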
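
A sketch for apis/webaudio/OscillatorNode/setWaveTable and apis/webaudio/OscillatorNode/playbackState, feature-checked because both were removed from the spec; the current equivalents are setPeriodicWave() and the ended event.

    const context = new AudioContext();
    const osc = context.createOscillator();

    // Fourier coefficients describing a single partial.
    const real = new Float32Array([0, 0]);
    const imag = new Float32Array([0, 1]);

    if (typeof osc.setWaveTable === 'function') {
      // Legacy path described by these entries.
      osc.setWaveTable(context.createWaveTable(real, imag));
      console.log(osc.playbackState); // UNSCHEDULED_STATE until start() is called
    } else {
      // Current replacement.
      osc.setPeriodicWave(context.createPeriodicWave(real, imag));
    }

    osc.connect(context.destination);
    osc.start();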
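
A sketch covering the four apis/webaudio/AudioProcessingEvent entries (playbackTime, outputBuffer, node, inputBuffer), again on the deprecated ScriptProcessorNode path; the 0.5 gain factor is arbitrary.

    const context = new AudioContext();
    const processor = context.createScriptProcessor(1024, 1, 1);

    processor.onaudioprocess = function (event) {
      // When this block will be heard, on the same clock as currentTime.
      console.log('plays at', event.playbackTime, 'now', context.currentTime);

      // Legacy attribute: the ScriptProcessorNode that dispatched the event
      // (may be undefined in current implementations).
      console.log(event.node === processor);

      // inputBuffer and outputBuffer are only meaningful inside this handler.
      const input = event.inputBuffer.getChannelData(0);
      const output = event.outputBuffer.getChannelData(0);
      for (let i = 0; i < input.length; i++) {
        output[i] = input[i] * 0.5; // arbitrary attenuation
      }
    };

    processor.connect(context.destination);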
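
A sketch for the three apis/webaudio/AudioParam entries (minValue, maxValue, computedValue), using a GainNode's gain parameter; computedValue only appeared in older drafts, so it is feature-checked.

    const context = new AudioContext();
    const param = context.createGain().gain;

    // Nominal range of the parameter; the value attribute may be set outside it.
    console.log(param.minValue, param.maxValue);

    // Schedule an automation event; the value actually driving the DSP
    // (the "computed value") follows this ramp rather than param.value.
    param.linearRampToValueAtTime(0.0, context.currentTime + 1.0);

    if ('computedValue' in param) {
      console.log(param.computedValue); // legacy draft attribute
    }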
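
A sketch for apis/webaudio/AudioDestinationNode/numberOfChannels as the entry describes it; the attribute comes from an older draft (the current spec exposes channelCount and maxChannelCount instead), so it is feature-checked.

    const context = new AudioContext();
    const destination = context.destination;

    // Upper limit imposed by the output hardware.
    console.log(destination.maxChannelCount);

    // Keep the request within the valid range of 1..maxChannelCount.
    const wanted = Math.min(2, destination.maxChannelCount);
    if ('numberOfChannels' in destination) {
      destination.numberOfChannels = wanted; // legacy draft attribute
    } else {
      destination.channelCount = wanted;     // current equivalent
    }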
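
A sketch for apis/webaudio/AudioContext/createWaveTable; the method was removed in favour of createPeriodicWave(), which takes the same two Float32Array arguments, so the replacement call is what actually runs here.

    const context = new AudioContext();

    // Fourier coefficients of the partials; per the entry above, the arrays
    // must have equal lengths of at most 4096, and the resulting waveform is
    // normalized to a peak of 1 (0dBFS).
    const real = new Float32Array([0, 0.5, 0.25]);
    const imag = new Float32Array([0, 0, 0]);

    // Removed method described by this entry:
    //   const table = context.createWaveTable(real, imag);
    // Current replacement with the same arguments:
    const wave = context.createPeriodicWave(real, imag);

    const osc = context.createOscillator();
    osc.setPeriodicWave(wave); // was osc.setWaveTable(table)
    osc.connect(context.destination);
    osc.start();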