OfflineAudioContext

Overview

The OfflineAudioContext interface is an AudioContext representing an audio-processing graph built from AudioNodes linked together. In contrast with a standard AudioContext, an OfflineAudioContext doesn't render the audio to the device hardware; instead, it generates it, as fast as it can, and outputs the result to an AudioBuffer.
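
As an illustration only, the sketch below (C#, matching the dynamic-typed signatures listed on this page) renders a short tone offline. The way the OfflineAudioContext instance is created, the AudioNode connect/start members, the renderedBuffer field of the complete event, and the use of an Action as an EventListener are assumptions based on the standard Web Audio API, not on anything documented here.

using System;

static class OfflineRenderSketch
{
    // 'ctx' is assumed to be an OfflineAudioContext obtained elsewhere
    // (in the browser API: new OfflineAudioContext(channels, length, sampleRate)).
    public static void Render(dynamic ctx)
    {
        dynamic osc = ctx.createOscillator();     // periodic-waveform source (see createOscillator below)
        osc.connect(ctx.destination);             // standard Web Audio AudioNode.connect (assumed)
        osc.start(0);                             // schedule the source at t = 0 (assumed)

        // The complete event fires once the whole buffer has been rendered.
        ctx.oncomplete = (Action<dynamic>)(e =>
        {
            dynamic rendered = e.renderedBuffer;  // the finished AudioBuffer (per the Web Audio spec)
            Console.WriteLine("rendered " + rendered.length + " sample-frames");
        });

        ctx.startRendering();                     // render as fast as possible; nothing is sent to hardware
    }
}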


createAnalyser    (declared in BaseAudioContext)

Creates an AnalyserNode, which can be used to expose audio time-domain and frequency data, for example to create data visualisations.

 

method createAnalyser: AnalyserNode

 

AnalyserNode createAnalyser()

 

func createAnalyser() -> AnalyserNode

 

AnalyserNode createAnalyser()

 

Function createAnalyser() As AnalyserNode
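
A minimal sketch of how an AnalyserNode might be used, assuming the binding also exposes the standard AnalyserNode members frequencyBinCount and getFloatFrequencyData (not documented on this page) and AudioNode.connect; the float[] marshalling is likewise an assumption.

static class SpectrumSketch
{
    // Tap the signal with an analyser and read one frequency snapshot (values in dB).
    public static float[] CaptureSpectrum(dynamic ctx, dynamic source)
    {
        dynamic analyser = ctx.createAnalyser();
        source.connect(analyser);              // source -> analyser
        analyser.connect(ctx.destination);     // the analyser passes audio through unchanged

        var bins = new float[(int)analyser.frequencyBinCount];
        analyser.getFloatFrequencyData(bins);  // fills 'bins' with the current spectrum
        return bins;
    }
}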

createBiquadFilter    (declared in BaseAudioContext)

Creates a BiquadFilterNode, which represents a second order filter configurable as several different common filter types: high-pass, low-pass, band-pass, etc.

 

method createBiquadFilter: BiquadFilterNode

 

BiquadFilterNode createBiquadFilter()

 

func createBiquadFilter() -> BiquadFilterNode

 

BiquadFilterNode createBiquadFilter()

 

Function createBiquadFilter() As BiquadFilterNode

createBufferSource    (declared in BaseAudioContext)

Creates an AudioBufferSourceNode, which can be used to play and manipulate audio data contained within an AudioBuffer object. AudioBuffers are created using AudioContext.createBuffer or returned by AudioContext.decodeAudioData when it successfully decodes an audio track.

 

method createBufferSource: AudioBufferSourceNode

 

AudioBufferSourceNode createBufferSource()

 

func createBufferSource() -> AudioBufferSourceNode

 

AudioBufferSourceNode createBufferSource()

 

Function createBufferSource() As AudioBufferSourceNode
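
A hypothetical sketch of playing an existing AudioBuffer through the offline graph; the buffer, connect and start members are standard AudioBufferSourceNode/AudioNode members from the Web Audio API and are assumed to be surfaced by this binding.

static class BufferPlaybackSketch
{
    // Schedule an already-decoded AudioBuffer at the start of the rendered timeline.
    public static void PlayBuffer(dynamic ctx, dynamic audioBuffer)
    {
        dynamic src = ctx.createBufferSource();
        src.buffer = audioBuffer;        // e.g. obtained from decodeAudioData
        src.connect(ctx.destination);
        src.start(0);                    // begin at t = 0
    }
}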

createChannelMerger    (declared in BaseAudioContext)

Creates a ChannelMergerNode, which is used to combine channels from multiple audio streams into a single audio stream.

 

method createChannelMerger(parnumberOfInputs: dynamic): ChannelMergerNode

 

ChannelMergerNode createChannelMerger(dynamic parnumberOfInputs)

 

func createChannelMerger(_ parnumberOfInputs: dynamic) -> ChannelMergerNode

 

ChannelMergerNode createChannelMerger(dynamic parnumberOfInputs)

 

Function createChannelMerger(parnumberOfInputs As dynamic) As ChannelMergerNode

Parameters:

  • parnumberOfInputs: the number of inputs to merge; each connected input becomes one channel of the single output stream.

createChannelSplitter    (declared in BaseAudioContext)

Creates a ChannelSplitterNode, which is used to access the individual channels of an audio stream and process them separately.

 

method createChannelSplitter(parnumberOfOutputs: dynamic): ChannelSplitterNode

 

ChannelSplitterNode createChannelSplitter(dynamic parnumberOfOutputs)

 

func createChannelSplitter(_ parnumberOfOutputs: dynamic) -> ChannelSplitterNode

 

ChannelSplitterNode createChannelSplitter(dynamic parnumberOfOutputs)

 

Function createChannelSplitter(parnumberOfOutputs As dynamic) As ChannelSplitterNode

Parameters:

  • parnumberOfOutputs: the number of outputs, i.e. how many channels of the input to expose separately.
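
The sketch below combines createChannelSplitter and createChannelMerger to process the two channels of a stereo stream independently; the three-argument connect(target, outputIndex, inputIndex) overload and GainNode.gain are standard Web Audio members assumed to be available here.

static class PerChannelProcessingSketch
{
    // Split a stereo source, attenuate only the right channel, and merge back to stereo.
    public static dynamic AttenuateRightChannel(dynamic ctx, dynamic stereoSource)
    {
        dynamic splitter  = ctx.createChannelSplitter(2);  // one output per input channel
        dynamic merger    = ctx.createChannelMerger(2);    // two inputs -> one stereo stream
        dynamic rightGain = ctx.createGain();
        rightGain.gain.value = 0.5;                        // halve the right channel only

        stereoSource.connect(splitter);
        splitter.connect(merger, 0, 0);                    // left channel straight through
        splitter.connect(rightGain, 1);                    // right channel via the gain node
        rightGain.connect(merger, 0, 1);

        merger.connect(ctx.destination);
        return merger;
    }
}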

createConstantSource    (declared in BaseAudioContext)

Creates a ConstantSourceNode object, which is an audio source that continuously outputs a monaural (one-channel) sound signal whose samples all have the same value.

 

method createConstantSource: ConstantSourceNode

 

ConstantSourceNode createConstantSource()

 

func createConstantSource() -> ConstantSourceNode

 

ConstantSourceNode createConstantSource()

 

Function createConstantSource() As ConstantSourceNode

createConvolver    (declared in BaseAudioContext)

Creates a ConvolverNode, which can be used to apply convolution effects to your audio graph, for example a reverberation effect.

 

method createConvolver: ConvolverNode

 

ConvolverNode createConvolver()

 

func createConvolver() -> ConvolverNode

 

ConvolverNode createConvolver()

 

Function createConvolver() As ConvolverNode

createDelay    (declared in BaseAudioContext)

Creates a DelayNode, which is used to delay the incoming audio signal by a certain amount. This node is also useful to create feedback loops in a Web Audio API graph.

 

method createDelay(parmaxDelayTime: dynamic): DelayNode

 

DelayNode createDelay(dynamic parmaxDelayTime)

 

func createDelay(_ parmaxDelayTime: dynamic) -> DelayNode

 

DelayNode createDelay(dynamic parmaxDelayTime)

 

Function createDelay(parmaxDelayTime As dynamic) As DelayNode

Parameters:

  • parmaxDelayTime: the maximum delay, in seconds, that the node will ever be asked to apply.
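
A sketch of the feedback loop mentioned above, producing a simple echo; DelayNode.delayTime, GainNode.gain and their value members come from the standard Web Audio API and are assumptions about this binding.

static class EchoSketch
{
    // Dry signal plus a decaying series of 300 ms echoes built from a delay/gain loop.
    public static void AddEcho(dynamic ctx, dynamic source)
    {
        dynamic delay    = ctx.createDelay(1.0);  // parmaxDelayTime: never more than 1 s of delay
        dynamic feedback = ctx.createGain();
        delay.delayTime.value = 0.3;              // 300 ms between repeats
        feedback.gain.value   = 0.4;              // each repeat at 40% of the previous level

        source.connect(ctx.destination);          // dry path
        source.connect(delay);                    // wet path
        delay.connect(feedback);
        feedback.connect(delay);                  // the feedback loop
        delay.connect(ctx.destination);
    }
}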

createDynamicsCompressor    (declared in BaseAudioContext)

Creates a DynamicsCompressorNode, which can be used to apply acoustic compression to an audio signal.

 

method createDynamicsCompressor: DynamicsCompressorNode

 

DynamicsCompressorNode createDynamicsCompressor()

 

func createDynamicsCompressor() -> DynamicsCompressorNode

 

DynamicsCompressorNode createDynamicsCompressor()

 

Function createDynamicsCompressor() As DynamicsCompressorNode

createGain    (declared in BaseAudioContext)

Creates a GainNode, which can be used to control the overall volume of the audio graph.

 

method createGain: GainNode

 

GainNode createGain()

 

func createGain() -> GainNode

 

GainNode createGain()

 

Function createGain() As GainNode

createIIRFilter    (declared in BaseAudioContext)

Creates an IIRFilterNode, which represents a general infinite impulse response (IIR) filter that can be configured to act as several different common filter types.

 

method createIIRFilter(parfeedforward: dynamic; parfeedback: dynamic): IIRFilterNode

 

IIRFilterNode createIIRFilter(dynamic parfeedforward, dynamic parfeedback)

 

func createIIRFilter(_ parfeedforward: dynamic, _ parfeedback: dynamic) -> IIRFilterNode

 

IIRFilterNode createIIRFilter(dynamic parfeedforward, dynamic parfeedback)

 

Function createIIRFilter(parfeedforward As dynamic, parfeedback As dynamic) As IIRFilterNode

Parameters:

  • parfeedforward: the feedforward (numerator) coefficients of the filter's transfer function.
  • parfeedback: the feedback (denominator) coefficients; the first coefficient must not be zero.
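
As a sketch of how the coefficient arrays are used, the example below builds a one-pole low-pass filter; the transfer function convention follows the Web Audio specification, and how the binding marshals the arrays passed as dynamic is an assumption.

static class IirFilterSketch
{
    // H(z) = 0.1 / (1 - 0.9 z^-1), i.e. y[n] = 0.1*x[n] + 0.9*y[n-1]  (unity gain at DC).
    public static dynamic OnePoleLowpass(dynamic ctx, dynamic source)
    {
        var feedforward = new double[] { 0.1 };         // numerator (b) coefficients
        var feedback    = new double[] { 1.0, -0.9 };   // denominator (a) coefficients, a0 != 0

        dynamic filter = ctx.createIIRFilter(feedforward, feedback);
        source.connect(filter);
        filter.connect(ctx.destination);
        return filter;
    }
}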

createOscillator    (declared in BaseAudioContext)

Creates an OscillatorNode, a source representing a periodic waveform. It basically generates a tone.

 

method createOscillator: OscillatorNode

 

OscillatorNode createOscillator()

 

func createOscillator() -> OscillatorNode

 

OscillatorNode createOscillator()

 

Function createOscillator() As OscillatorNode

createPanner    (declared in BaseAudioContext)

Creates a PannerNode, which is used to spatialise an incoming audio stream in 3D space.

 

method createPanner: PannerNode

 

PannerNode createPanner()

 

func createPanner() -> PannerNode

 

PannerNode createPanner()

 

Function createPanner() As PannerNode

createPeriodicWave    (declared in BaseAudioContext)

Creates a PeriodicWave, used to define a periodic waveform that can be used to determine the output of an OscillatorNode.

 

method createPeriodicWave(parreal: dynamic; parimag: dynamic): PeriodicWave

 

PeriodicWave createPeriodicWave(dynamic parreal, dynamic parimag)

 

func createPeriodicWave(_ parreal: dynamic, _ parimag: dynamic) -> PeriodicWave

 

PeriodicWave createPeriodicWave(dynamic parreal, dynamic parimag)

 

Function createPeriodicWave(parreal As dynamic, parimag As dynamic) As PeriodicWave

Parameters:

  • parreal: the cosine (real) Fourier coefficients describing the waveform.
  • parimag: the sine (imaginary) Fourier coefficients describing the waveform.
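
The sketch below builds a small harmonic series and feeds it to an oscillator; OscillatorNode.setPeriodicWave is a standard Web Audio member not documented on this page, and the float[] marshalling is an assumption.

static class PeriodicWaveSketch
{
    // Index 0 is the DC term (ignored); index 1 is the fundamental, index 2 the 2nd harmonic.
    public static dynamic CustomToneSource(dynamic ctx)
    {
        var real = new float[] { 0f, 0f, 0f };    // parreal: cosine terms
        var imag = new float[] { 0f, 1f, 0.5f };  // parimag: sine terms

        dynamic wave = ctx.createPeriodicWave(real, imag);
        dynamic osc  = ctx.createOscillator();
        osc.setPeriodicWave(wave);                // use the custom waveform instead of a preset
        osc.connect(ctx.destination);
        return osc;
    }
}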

createStereoPanner    (declared in BaseAudioContext)

Creates a StereoPannerNode, which can be used to apply stereo panning to an audio source.

 

method createStereoPanner: StereoPannerNode

 

StereoPannerNode createStereoPanner()

 

func createStereoPanner() -> StereoPannerNode

 

StereoPannerNode createStereoPanner()

 

Function createStereoPanner() As StereoPannerNode

createWaveShaper    (declared in BaseAudioContext)

Creates a WaveShaperNode, which is used to implement non-linear distortion effects.

 

method createWaveShaper: WaveShaperNode

 

WaveShaperNode createWaveShaper()

 

func createWaveShaper() -> WaveShaperNode

 

WaveShaperNode createWaveShaper()

 

Function createWaveShaper() As WaveShaperNode

currentTime    (declared in BaseAudioContext)

The current time of the context, in seconds, starting at zero. For an OfflineAudioContext it advances as audio is rendered rather than in real time.

property currentTime: DateTime read;

 

DateTime currentTime { get; }

 

var currentTime: DateTime { get{} }

 

DateTime currentTime { __get; }

 

ReadOnly Property currentTime() As DateTime

decodeAudioData    (declared in BaseAudioContext)

Asynchronously decodes audio file data contained in an ArrayBuffer. In this case, the ArrayBuffer is usually loaded from an XMLHttpRequest's response attribute after setting the responseType to arraybuffer. This method only works on complete files, not fragments of audio files.

 

method decodeAudioData(parArrayBuffer: dynamic; parDecodeSuccessCallback: dynamic; parDecodeErrorCallback: dynamic): array of Byte

 

Byte[] decodeAudioData(dynamic parArrayBuffer, dynamic parDecodeSuccessCallback, dynamic parDecodeErrorCallback)

 

func decodeAudioData(_ parArrayBuffer: dynamic, _ parDecodeSuccessCallback: dynamic, _ parDecodeErrorCallback: dynamic) -> Byte...

 

Byte[] decodeAudioData(dynamic parArrayBuffer, dynamic parDecodeSuccessCallback, dynamic parDecodeErrorCallback)

 

Function decodeAudioData(parArrayBuffer As dynamic, parDecodeSuccessCallback As dynamic, parDecodeErrorCallback As dynamic) As Byte()

Parameters:

  • parArrayBuffer: the ArrayBuffer containing the encoded audio data to decode.
  • parDecodeSuccessCallback: callback invoked with the decoded AudioBuffer on success.
  • parDecodeErrorCallback: callback invoked with an error if decoding fails.
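
A sketch of decoding a complete encoded file and scheduling the result; how the binding represents the ArrayBuffer argument and whether Action delegates are accepted as the callbacks are assumptions.

using System;

static class DecodeSketch
{
    // Decode encoded audio data, then play the resulting AudioBuffer in the offline graph.
    public static void DecodeAndPlay(dynamic ctx, dynamic encodedArrayBuffer)
    {
        ctx.decodeAudioData(
            encodedArrayBuffer,
            (Action<dynamic>)(decoded =>              // parDecodeSuccessCallback
            {
                dynamic src = ctx.createBufferSource();
                src.buffer = decoded;                 // 'decoded' is an AudioBuffer
                src.connect(ctx.destination);
                src.start(0);
            }),
            (Action<dynamic>)(err =>                  // parDecodeErrorCallback
            {
                Console.WriteLine("decoding failed: " + err);
            }));
    }
}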

destination    (declared in BaseAudioContext)

The final AudioDestinationNode of the graph. For an OfflineAudioContext, audio reaching this node is written into the rendered AudioBuffer instead of being played on hardware.

property destination: AudioDestinationNode read;

 

AudioDestinationNode destination { get; }

 

var destination: AudioDestinationNode { get{} }

 

AudioDestinationNode destination { __get; }

 

ReadOnly Property destination() As AudioDestinationNode

length

The length of the buffer to be rendered, in sample-frames.

property length: Int32 read;

 

Int32 length { get; }

 

var length: Int32 { get{} }

 

Int32 length { __get; }

 

ReadOnly Property length() As Int32

listener    (declared in BaseAudioContext)

The AudioListener used for 3D spatialisation.

property listener: AudioListener read;

 

AudioListener listener { get; }

 

var listener: AudioListener { get{} }

 

AudioListener listener { __get; }

 

ReadOnly Property listener() As AudioListener

oncomplete

An event handler invoked when rendering has finished, i.e. when the complete event fires.

property oncomplete: EventListener read write;

 

EventListener oncomplete { get; set; }

 

var oncomplete: EventListener { get{} set{} }

 

EventListener oncomplete { __get; __set; }

 

Property oncomplete() As EventListener

onstatechange    (declared in BaseAudioContext)

An event handler invoked whenever the context's state changes.

property onstatechange: EventListener read write;

 

EventListener onstatechange { get; set; }

 

var onstatechange: EventListener { get{} set{} }

 

EventListener onstatechange { __get; __set; }

 

Property onstatechange() As EventListener

resume

Resumes the progression of time in an audio context that has previously been suspended.

 

method resume: dynamic

 

dynamic resume()

 

func resume() -> dynamic

 

dynamic resume()

 

Function resume() As dynamic

sampleRate    (declared in BaseAudioContext)

The sample rate used by all nodes in this context, in samples per second.

property sampleRate: Double read;

 

Double sampleRate { get; }

 

var sampleRate: Double { get{} }

 

Double sampleRate { __get; }

 

ReadOnly Property sampleRate() As Double

startRendering

Starts rendering the audio, taking into account the current connections and the currently scheduled changes. Completion can be observed either through the complete event (oncomplete) or through the returned promise.

 

method startRendering: dynamic

 

dynamic startRendering()

 

func startRendering() -> dynamic

 

dynamic startRendering()

 

Function startRendering() As dynamic

state    (declared in BaseAudioContext)

The current state of the context: "suspended", "running", or "closed".

property state: String read;

 

String state { get; }

 

var state: String { get{} }

 

String state { __get; }

 

ReadOnly Property state() As String

suspend

Schedules a suspension of the time progression in the audio context at the specified time and returns a promise.

 

method suspend(parsuspendTime: dynamic): dynamic

 

dynamic suspend(dynamic parsuspendTime)

 

func suspend(_ parsuspendTime: dynamic) -> dynamic

 

dynamic suspend(dynamic parsuspendTime)

 

Function suspend(parsuspendTime As dynamic) As dynamic

Parameters:

  • parsuspendTime: the time, in seconds of rendered audio, at which to suspend rendering.
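
During offline rendering, suspend takes a time on the rendered timeline and lets you modify the graph before resuming; the promise-style then chaining below mirrors the browser API and, like the Action-to-callback conversion, is an assumption about this binding.

using System;

static class MidRenderChangeSketch
{
    // Pause rendering at t = 0.5 s of rendered audio, change a gain, then continue to the end.
    public static void RenderWithMidpointChange(dynamic ctx, dynamic gainNode)
    {
        ctx.suspend(0.5).then((Action<dynamic>)(_ =>
        {
            gainNode.gain.value = 0.1;   // alter the graph while time progression is suspended
            ctx.resume();                // resume rendering the remainder of the buffer
        }));

        ctx.startRendering();
    }
}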

 
